* Package: dev-python/msrest-0.7.1:0
* Repository: guru
* Maintainer: cyber+gentoo@sysrq.in
* USE: abi_x86_64 amd64 elibc_glibc kernel_linux python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 test userland_GNU
* FEATURES: network-sandbox preserve-libs sandbox test userpriv usersandbox

@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@
This information may help you to understand if this is a duplicate or if this bug exists after you pushed a fix.
This ebuild was merged at the following commit:
https://github.com/gentoo/gentoo/commit/1b008898d1f8eb2d6c3c4b9eee35f9b1f6237b77
(Fri Nov 4 21:54:38 UTC 2022)
@@@@@ END @@@@@

@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@
This information may help you to understand if this is a duplicate or if this bug exists after you pushed a fix.
This ebuild was merged at the following commit:
https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=75464915fc7431e173764ebf602143a8d5b12e7e
(Fri Nov 4 14:10:25 UTC 2022)
@@@@@ END @@@@@

@@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@
This ebuild was merged (directly or as a dependency) because of the following commit:
https://gitweb.gentoo.org/repo/proj/guru.git/commit/?id=ae39da72ee3df4fd46e50a10d7c39cf36567cfaf
@@@@@ END @@@@@

##################
# emerge --info: #
##################
Portage 3.0.38.1 (python 3.10.8-final-0, default/linux/amd64/17.1, gcc-12, glibc-2.36-r5, 5.10.133-gentoo x86_64)
=================================================================
System uname: Linux-5.10.133-gentoo-x86_64-Intel-R-_Xeon-R-_CPU_E5-2650_v4_@_2.20GHz-with-glibc2.36
KiB Mem: 264024340 total, 51966128 free
KiB Swap: 0 total, 0 free
sh bash 5.1_p16-r2
ld GNU ld (Gentoo 2.39 p5) 2.39.0
app-misc/pax-utils: 1.3.5::gentoo
app-shells/bash: 5.1_p16-r2::gentoo
dev-lang/perl: 5.36.0-r1::gentoo
dev-lang/python: 2.7.18_p16::gentoo, 3.8.15_p2::gentoo, 3.9.15_p2::gentoo, 3.10.8_p2::gentoo, 3.11.0_p1::gentoo
dev-lang/rust: 1.64.0-r1::gentoo
dev-util/cmake: 3.24.3::gentoo
dev-util/meson: 0.63.3::gentoo
sys-apps/baselayout: 2.9::gentoo
sys-apps/openrc: 0.45.2-r1::gentoo
sys-apps/sandbox: 2.29::gentoo
sys-devel/autoconf: 2.71-r4::gentoo
sys-devel/automake: 1.16.5::gentoo
sys-devel/binutils: 2.39-r4::gentoo
sys-devel/binutils-config: 5.4.1::gentoo
sys-devel/gcc: 12.2.1_p20221008::gentoo
sys-devel/gcc-config: 2.8::gentoo
sys-devel/libtool: 2.4.7::gentoo
sys-devel/make: 4.4::gentoo
sys-kernel/linux-headers: 6.0::gentoo (virtual/os-headers)
sys-libs/glibc: 2.36-r5::gentoo
Repositories:

gentoo
    location: /usr/portage
    sync-type: rsync
    sync-uri: rsync://rsync.gentoo.org/gentoo-portage
    priority: -1000
    sync-rsync-verify-jobs: 1
    sync-rsync-verify-metamanifest: yes
    sync-rsync-extra-opts:
    sync-rsync-verify-max-age: 24

guru
    location: /opt/guru
    masters: gentoo
    priority: 0

ACCEPT_KEYWORDS="amd64 ~amd64"
ACCEPT_LICENSE="* MIT"
CBUILD="x86_64-pc-linux-gnu"
CFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
CHOST="x86_64-pc-linux-gnu"
CONFIG_PROTECT="/etc /usr/share/gnupg/qualified.txt"
CONFIG_PROTECT_MASK="/etc/ca-certificates.conf /etc/env.d /etc/fonts/fonts.conf /etc/gconf /etc/gentoo-release /etc/revdep-rebuild /etc/sandbox.d /etc/terminfo"
CXXFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
DISTDIR="/var/tmp/portage/dev-python/msrest-0.7.1/distdir"
EMERGE_DEFAULT_OPTS="--with-bdeps=y -1 -k -b"
ENV_UNSET="CARGO_HOME DBUS_SESSION_BUS_ADDRESS DISPLAY GOBIN GOPATH PERL5LIB PERL5OPT PERLPREFIX PERL_CORE PERL_MB_OPT PERL_MM_OPT XAUTHORITY XDG_CACHE_HOME
XDG_CONFIG_HOME XDG_DATA_HOME XDG_RUNTIME_DIR XDG_STATE_HOME"
FCFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
FEATURES="assume-digests binpkg-docompress binpkg-dostrip binpkg-logs binpkg-multi-instance buildpkg buildpkg-live config-protect-if-modified distlocks ebuild-locks fixlafiles ipc-sandbox merge-sync multilib-strict network-sandbox news parallel-fetch pid-sandbox preserve-libs protect-owned qa-unresolved-soname-deps sandbox sfperms sign split-log strict test unknown-features-warn unmerge-logs unmerge-orphans userfetch userpriv usersandbox usersync xattr"
FFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0"
GENTOO_MIRRORS="http://mirror.leaseweb.com/gentoo/ http://ftp.snt.utwente.nl/pub/os/linux/gentoo/ http://ftp.belnet.be/pub/rsync.gentoo.org/gentoo/ http://distfiles.gentoo.org"
LANG="C.UTF8"
LDFLAGS="-Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0"
MAKEOPTS="-j46"
PKGDIR="/root/tbci/binpkg"
PORTAGE_CONFIGROOT="/"
PORTAGE_RSYNC_OPTS="--recursive --links --safe-links --perms --times --omit-dir-times --compress --force --whole-file --delete --stats --human-readable --timeout=180 --exclude=/distfiles --exclude=/local --exclude=/packages --exclude=/.git"
PORTAGE_TMPDIR="/var/tmp"
SHELL="/bin/bash"
USE="acl amd64 bzip2 cli crypt dri elogind fortran gdbm iconv ipv6 jumbo-build libglvnd libtirpc multilib native-symlinks ncurses nls nptl openmp pam pcre readline seccomp split-usr ssl test test-rust unicode xattr zlib" ABI_X86="64" ELIBC="glibc" KERNEL="linux" PYTHON_TARGETS="python3_8 python3_9 python3_10" USERLAND="GNU"
Unset: ADDR2LINE, AR, ARFLAGS, AS, ASFLAGS, CC, CCLD, CONFIG_SHELL, CPP, CPPFLAGS, CTARGET, CXX, CXXFILT, ELFEDIT, EXTRA_ECONF, F77FLAGS, FC, GCOV, GPROF, INSTALL_MASK, LC_ALL, LD, LEX, LFLAGS, LIBTOOL, LINGUAS, MAKE, MAKEFLAGS, NM, OBJCOPY, OBJDUMP, PORTAGE_BINHOST, PORTAGE_BUNZIP2_COMMAND, PORTAGE_COMPRESS, PORTAGE_COMPRESS_FLAGS, PORTAGE_RSYNC_EXTRA_OPTS, RANLIB, READELF, RUSTFLAGS, SIZE, STRINGS, STRIP, YACC, YFLAGS

##############################
# emerge history (qlop -mv): #
##############################
2022-11-05T03:32:22 >>> dev-python/pluggy-1.0.0-r2
2022-11-05T03:32:19 >>> sys-apps/lsb-release-3.2
2022-11-05T03:32:17 >>> app-eselect/eselect-rust-20210703
2022-11-05T03:32:21 >>> dev-python/iniconfig-1.1.1-r1
2022-11-05T03:32:20 >>> dev-python/exceptiongroup-1.0.1
2022-11-05T03:32:24 >>> dev-python/sniffio-1.3.0
2022-11-05T03:32:23 >>> dev-python/zope-interface-5.5.1
2022-11-05T03:32:24 >>> dev-python/async_generator-1.10-r2
2022-11-05T03:32:29 >>> dev-python/semantic_version-2.10.0
2022-11-05T03:32:27 >>> dev-python/sortedcontainers-2.4.0-r1
2022-11-05T03:32:30 >>> dev-python/ply-3.11-r2
2022-11-05T03:32:32 >>> dev-python/async-timeout-4.0.2-r1
2022-11-05T03:32:51 >>> dev-lang/rust-1.64.0-r1
2022-11-05T03:32:33 >>> dev-python/frozenlist-1.3.1
2022-11-05T03:32:36 >>> dev-python/httpretty-1.1.4-r1
2022-11-05T03:32:35 >>> dev-python/isodate-0.6.1-r1
2022-11-05T03:32:34 >>> dev-python/multidict-6.0.2
2022-11-05T03:33:34 >>> dev-python/attrs-22.1.0
2022-11-05T03:33:49 >>> dev-python/flit_scm-1.7.0
2022-11-05T03:32:56 >>> dev-python/pyjwt-2.6.0
2022-11-05T03:32:57 >>> dev-python/blinker-1.5
2022-11-05T03:33:54 >>> dev-python/pycparser-2.21-r1
2022-11-05T03:34:01 >>> dev-python/azure-core-1.26.1
2022-11-05T03:34:16 >>> virtual/rust-1.64.0
2022-11-05T03:34:24 >>> dev-python/aiosignal-1.2.0-r1
2022-11-05T03:34:54 >>> dev-python/outcome-1.2.0
2022-11-05T03:35:02 >>> dev-python/pytest-7.2.0
2022-11-05T03:35:20 >>> dev-python/cffi-1.15.1
2022-11-05T03:35:31 >>> dev-python/setuptools-rust-1.5.2
2022-11-05T03:35:47 >>> dev-python/trio-0.22.0
2022-11-05T03:36:00 >>> virtual/python-cffi-1
2022-11-05T03:36:05 >>> dev-python/cryptography-38.0.3
2022-11-05T03:34:46 >>> dev-python/yarl-1.8.1
2022-11-05T03:33:03 >>> app-arch/brotli-1.0.9-r5
2022-11-05T03:36:27 >>> dev-python/oauthlib-3.2.2
2022-11-05T03:36:21 >>> dev-python/pycares-4.2.2
2022-11-05T03:37:04 >>> dev-python/requests-oauthlib-1.3.1
2022-11-05T03:37:09 >>> dev-python/aiodns-3.0.0-r1
2022-11-05T03:36:45 >>> dev-python/aiohttp-3.8.3-r1

#######################################
# installed packages (qlist -ICvUSS): #
#######################################
acct-group/audio-0-r1:0 acct-group/cdrom-0-r1:0 acct-group/dialout-0-r1:0 acct-group/disk-0-r1:0 acct-group/input-0-r1:0 acct-group/kmem-0-r1:0 acct-group/kvm-0-r1:0 acct-group/lp-0-r1:0 acct-group/man-0-r1:0 acct-group/messagebus-0-r1:0 acct-group/polkitd-0-r1:0 acct-group/portage-0:0 acct-group/render-0-r1:0 acct-group/sgx-0:0 acct-group/sshd-0-r1:0 acct-group/tape-0-r1:0 acct-group/tty-0-r1:0 acct-group/video-0-r1:0 acct-user/man-1-r1:0 acct-user/messagebus-0-r1:0 acct-user/polkitd-0-r1:0 acct-user/portage-0:0 acct-user/sshd-0-r1:0 app-admin/eselect-1.4.20:0 -doc -emacs -vim-syntax app-admin/perl-cleaner-2.30:0 app-arch/brotli-1.0.9-r5:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 python python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -static-libs -test app-arch/bzip2-1.0.8-r3:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static -static-libs -verify-sig app-arch/gzip-1.12-r2:0 -pic -static -verify-sig app-arch/libarchive-3.6.1:0/13 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -blake2 bzip2 e2fsprogs -expat iconv -lz4 lzma -lzo -nettle -static-libs -verify-sig xattr -zstd app-arch/tar-1.34-r1:0 acl -minimal nls -selinux -verify-sig xattr app-arch/unzip-6.0_p27-r1:0 bzip2 -natspec unicode app-arch/xz-utils-5.2.7:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 extra-filters nls split-usr -static-libs -verify-sig app-arch/zstd-1.5.2-r3:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -lz4 split-usr -static-libs app-crypt/gnupg-2.3.8:0 bzip2 -doc -ldap nls readline -selinux smartcard ssl -test tofu -tools -tpm -usb -user-socket -verify-sig -wks-server app-crypt/gpgme-1.18.0-r2:1/11.6.15.1 -common-lisp cxx -python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -qt5 -static-libs -test -verify-sig app-crypt/libb2-0.98.1-r3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -native-cflags openmp -static-libs app-crypt/libmd-1.0.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 app-crypt/pinentry-1.2.1-r1:0 -caps -efl -emacs -gnome-keyring -gtk ncurses -qt5 -verify-sig app-crypt/rhash-1.4.3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls ssl -static-libs app-editors/nano-6.4:0 -debug -justify -magic -minimal ncurses nls spell split-usr
-static unicode app-eselect/eselect-fontconfig-20220403:0 app-eselect/eselect-iptables-20220320:0 app-eselect/eselect-lib-bin-symlink-0.1.1-r1:0 app-eselect/eselect-pinentry-0.7.2:0 app-eselect/eselect-rust-20210703:0 app-i18n/man-pages-ja-20180315-r1:0 app-i18n/man-pages-l10n-4.14.0-r1:0 l10n_cs l10n_da l10n_de l10n_el l10n_es l10n_fi l10n_fr l10n_hu l10n_id l10n_it l10n_mk l10n_nb l10n_nl l10n_pl l10n_pt-BR l10n_ro l10n_sr l10n_sv l10n_uk l10n_vi app-i18n/man-pages-ru-5.03.2390.2390.20191017-r1:0 app-i18n/man-pages-zh_CN-1.6.3.6:0 app-misc/c_rehash-1.7-r1:0 app-misc/ca-certificates-20211016.3.83:0 -cacert app-misc/editor-wrapper-4-r1:0 app-misc/mime-types-2.1.53:0 -nginx app-misc/pax-utils-1.3.5:0 -caps man -python python_single_target_python3_10 -python_single_target_python3_11 -python_single_target_python3_8 -python_single_target_python3_9 seccomp -test app-misc/tmux-3.3a-r1:0 -debug -selinux -systemd -utempter -vim-syntax app-portage/eix-0.36.5:0 -debug -doc nls -sqlite app-portage/elt-patches-20220831:0 app-portage/gemato-17.0:0 gpg -pretty-log python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test -tools app-portage/gentoolkit-0.6.1-r3:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test app-portage/portage-utils-0.94.3:0 openmp qmanifest qtegrity -static app-shells/bash-5.1_p16-r2:0 -afs -bashlogger -examples -mem-scramble net nls -plugins readline -verify-sig app-shells/bash-completion-2.11:0 eselect -test app-shells/gentoo-bashcomp-20190211-r1:0 app-shells/push-3.4:0 app-shells/quoter-4.2:0 app-text/ansifilter-2.18:0 -qt5 app-text/build-docbook-catalog-2.3-r1:0 app-text/docbook-xml-dtd-4.5-r2:4.5 app-text/docbook-xml-dtd-4.4-r3:4.4 app-text/docbook-xml-dtd-4.2-r3:4.2 app-text/docbook-xml-dtd-4.1.2-r7:4.1.2 app-text/docbook-xsl-stylesheets-1.79.1-r3:0 -ruby app-text/manpager-1:0 app-text/opensp-1.5.2-r7:0 -doc nls -static-libs -test app-text/po4a-0.68:0 -test -test app-text/sgml-common-0.6.3-r7:0 app-text/xmlto-0.0.28-r9:0 -latex text dev-db/sqlite-3.39.4:3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc -icu readline -secure-delete -static-libs -tcl -test -tools dev-lang/duktape-2.7.0-r3:0/2.7.0 dev-lang/perl-5.36.0-r1:0/5.36 -berkdb -debug -doc gdbm ithreads -minimal -quadmath dev-lang/python-3.11.0_p1:3.11 -bluetooth -build ensurepip -examples gdbm -hardened -libedit -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig dev-lang/python-3.10.8_p2:3.10 -bluetooth -build ensurepip -examples gdbm -hardened -libedit -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig xml dev-lang/python-3.9.15_p2:3.9 -bluetooth -build ensurepip -examples gdbm -hardened -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig xml dev-lang/python-3.8.15_p2:3.8 -bluetooth -build ensurepip -examples gdbm -hardened -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst xml dev-lang/python-2.7.18_p16:2.7 -berkdb -bluetooth -build -examples gdbm -hardened ncurses readline sqlite ssl -tk -verify-sig -wininst xml dev-lang/python-exec-2.4.9:2 native-symlinks python_targets_pypy3 python_targets_python3_10 python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-lang/python-exec-conf-2.4.6:2 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 
dev-lang/rust-1.64.0-r1:stable/1.64 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -clippy cpu_flags_x86_sse2 -debug -dist -doc -llvm-libunwind -llvm_targets_AArch64 -llvm_targets_AMDGPU -llvm_targets_ARM -llvm_targets_AVR -llvm_targets_BPF -llvm_targets_Hexagon -llvm_targets_Lanai -llvm_targets_MSP430 -llvm_targets_Mips -llvm_targets_NVPTX -llvm_targets_PowerPC -llvm_targets_RISCV -llvm_targets_Sparc -llvm_targets_SystemZ -llvm_targets_WebAssembly llvm_targets_X86 -llvm_targets_XCore -miri nightly -parallel-compiler -profiler -rls -rust-analyzer -rust-src rustfmt -system-bootstrap -system-llvm -test -verify-sig -wasm dev-lang/tcl-8.6.12-r1:0/8.6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug threads dev-libs/boehm-gc-8.2.2-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cxx large -static-libs threads dev-libs/elfutils-0.188:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma nls -static-libs -test utils -valgrind -verify-sig -zstd dev-libs/expat-2.5.0:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples -static-libs unicode dev-libs/glib-2.74.1-r1:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -dbus -debug elf -gtk-doc mime -selinux -static-libs -sysprof -systemtap -test -utils xattr dev-libs/gmp-6.2.1-r2:0/10.4 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cxx -doc -pic -static-libs dev-libs/gobject-introspection-1.74.0:0 -doctool -gtk-doc python_single_target_python3_10 -python_single_target_python3_11 -python_single_target_python3_8 -python_single_target_python3_9 -test dev-libs/gobject-introspection-common-1.74.0:0 dev-libs/isl-0.24-r2:0/23 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/jsoncpp-1.9.5:0/25 -doc -test dev-libs/libassuan-2.5.5:0 dev-libs/libatomic_ops-7.6.14:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 dev-libs/libbsd-0.11.7:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -verify-sig dev-libs/libevent-2.1.12:0/2.1-7 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 clock-gettime -debug -malloc-replacement ssl -static-libs -test threads -verbose-debug dev-libs/libffi-3.4.4:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -exec-static-trampoline -pax-kernel -static-libs -test dev-libs/libgcrypt-1.10.1-r2:0/20 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm -cpu_flags_arm_aes -cpu_flags_arm_neon -cpu_flags_arm_sha1 -cpu_flags_arm_sha2 -cpu_flags_ppc_altivec -cpu_flags_ppc_vsx2 -cpu_flags_ppc_vsx3 cpu_flags_x86_aes cpu_flags_x86_avx cpu_flags_x86_avx2 -cpu_flags_x86_padlock -cpu_flags_x86_sha cpu_flags_x86_sse4_1 -doc -static-libs -verify-sig dev-libs/libgpg-error-1.46-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -common-lisp nls -static-libs -test -verify-sig dev-libs/libksba-1.6.2:0 -static-libs 
-verify-sig dev-libs/libltdl-2.4.7:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/libpcre2-10.40-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 jit -libedit pcre16 pcre32 readline split-usr -static-libs unicode -verify-sig zlib dev-libs/libpipeline-1.5.6:0 -test dev-libs/libtasn1-4.19.0:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -test -valgrind -verify-sig dev-libs/libunistring-1.1-r1:0/5 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -static-libs dev-libs/libuv-1.44.2-r1:0/1 dev-libs/libxml2-2.10.3:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -examples ftp -icu -lzma python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 readline -static-libs -test dev-libs/libxslt-1.1.37:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 crypt -debug -examples -python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -static-libs dev-libs/lzo-2.10:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples split-usr -static-libs dev-libs/mpc-1.2.1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/mpfr-4.1.0_p13-r1:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/nettle-3.8.1:0/8-6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm -cpu_flags_arm_aes -cpu_flags_arm_neon -cpu_flags_arm_sha1 -cpu_flags_arm_sha2 -cpu_flags_ppc_altivec cpu_flags_x86_aes cpu_flags_x86_pclmul -cpu_flags_x86_sha -doc gmp -static-libs -verify-sig dev-libs/npth-1.6-r1:0 -test dev-libs/openssl-1.1.1s:0/1.1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cpu_flags_x86_sse2 -rfc3779 -sctp -sslv3 -static-libs -test -tls-compression -tls-heartbeat -vanilla -verify-sig -verify-sig -weak-ssl-ciphers dev-libs/popt-1.19:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static-libs dev-perl/Devel-CheckLib-1.140.0:0 -test dev-perl/Encode-EUCJPASCII-0.30.0-r1:0 -test dev-perl/Encode-HanExtra-0.230.0-r3:0 dev-perl/Encode-Locale-1.50.0-r1:0 -test dev-perl/ExtUtils-CChecker-0.110.0:0 -test dev-perl/File-BaseDir-0.90.0:0 -test dev-perl/File-DesktopEntry-0.220.0-r1:0 -test dev-perl/File-Listing-6.150.0:0 -test -test dev-perl/File-MimeInfo-0.300.0:0 -test dev-perl/HTML-Parser-3.760.0:0 -test dev-perl/HTML-Tagset-3.200.0-r2:0 dev-perl/HTTP-Cookies-6.100.0:0 -test dev-perl/HTTP-Date-6.50.0:0 dev-perl/HTTP-Message-6.330.0:0 -test -test dev-perl/HTTP-Negotiate-6.10.0-r2:0 -test dev-perl/IO-HTML-1.4.0:0 -test dev-perl/IO-Socket-INET6-2.720.0-r2:0 -test dev-perl/IO-Socket-SSL-2.74.0:0 -examples -idn -test dev-perl/IPC-System-Simple-1.300.0:0 -test dev-perl/libwww-perl-6.600.0-r1:0 ssl -test dev-perl/Locale-gettext-1.70.0-r1:0 -test dev-perl/LWP-MediaTypes-6.40.0:0 -test dev-perl/LWP-Protocol-https-6.100.0:0 -test dev-perl/MIME-Charset-1.12.2-r1:0 l10n_ja 
l10n_zh -test dev-perl/Module-Build-0.423.100:0 -test dev-perl/Mozilla-CA-20999999-r1:0 -test dev-perl/Net-HTTP-6.210.0:0 -minimal -test dev-perl/Net-SSLeay-1.920.0:0 -examples -examples -minimal -test dev-perl/Pod-Parser-1.630.0-r1:0 -test dev-perl/SGMLSpm-1.1-r2:0 -test dev-perl/Socket6-0.290.0:0 -test dev-perl/Sub-Name-0.260.0:0 -suggested -test dev-perl/Syntax-Keyword-Try-0.270.0:0 -test dev-perl/TermReadKey-2.380.0:0 -examples -test dev-perl/Text-CharWidth-0.40.0-r2:0 -test dev-perl/Text-WrapI18N-0.60.0-r2:0 -test dev-perl/TimeDate-2.330.0-r1:0 -test dev-perl/Try-Tiny-0.310.0:0 -minimal -test dev-perl/Unicode-LineBreak-2019.1.0:0 dev-perl/URI-5.110.0:0 -test dev-perl/WWW-RobotRules-6.20.0-r2:0 -test dev-perl/XML-Parser-2.460.0-r2:0 dev-perl/XS-Parse-Keyword-0.250.0:0 -test dev-perl/YAML-Tiny-1.730.0-r1:0 -minimal -test dev-python/aiodns-3.0.0-r1:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/aiohttp-3.8.3-r1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test test-rust dev-python/aiosignal-1.2.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/appdirs-1.4.4-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 dev-python/async-timeout-4.0.2-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/async_generator-1.10-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/attrs-22.1.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/autocommand-2.2.1_p20211118:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/azure-core-1.26.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/blinker-1.5:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/certifi-3021.3.16-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/cffi-1.15.1:0/1.15.1 -doc python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/charset_normalizer-3.0.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/cryptography-38.0.3:0 -debug python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/cython-0.29.32:0 -doc -emacs python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/ensurepip-pip-22.3:0 dev-python/ensurepip-setuptools-65.5.0:0 dev-python/ensurepip-wheels-100:0 dev-python/exceptiongroup-1.0.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test 
dev-python/flit_core-3.7.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/flit_scm-1.7.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 dev-python/frozenlist-1.3.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/gpep517-9:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/httpretty-1.1.4-r1:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test test-rust dev-python/idna-3.4:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/importlib_metadata-5.0.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/importlib_resources-5.10.0:0 python_targets_pypy3 python_targets_python3_8 -test dev-python/inflect-6.0.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/iniconfig-1.1.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/installer-0.5.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/isodate-0.6.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/jaraco-context-4.1.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/jaraco-functools-3.5.2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/jaraco-text-3.10.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/jinja-3.1.2:0 -doc -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/markupsafe-2.1.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/more-itertools-9.0.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/multidict-6.0.2:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/nspektr-0.4.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/oauthlib-3.2.2:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/ordered-set-4.1.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/outcome-1.2.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 
python_targets_python3_8 python_targets_python3_9 -test dev-python/packaging-21.3-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/pluggy-1.0.0-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/ply-3.11-r2:0/3.11 -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 dev-python/pycares-4.2.2:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/pycparser-2.21-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/pydantic-1.10.2:0 native-extensions python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/pyjwt-2.6.0:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/pyparsing-3.0.9:0 -examples python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/pypy3-7.3.9_p8:0/pypy39-pp73-336 ensurepip gdbm jit ncurses -sqlite -test -tk dev-python/pypy3-exe-7.3.9_p3:3.9-7.3.9 -cpu_flags_x86_sse2 jit -low-memory ncurses dev-python/PySocks-1.7.1-r2:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 dev-python/pytest-7.2.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/requests-2.28.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -socks5 -test test-rust dev-python/requests-oauthlib-1.3.1:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/semantic_version-2.10.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/setuptools-65.5.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/setuptools-rust-1.5.2:0 -debug python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/setuptools_scm-7.0.5:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/six-1.16.0-r1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/sniffio-1.3.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/sortedcontainers-2.4.0-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/tomli-2.0.1-r1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/trio-0.22.0:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 
python_targets_python3_8 python_targets_python3_9 -test dev-python/typing-extensions-4.3.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 dev-python/urllib3-1.26.12:0 -brotli python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/wheel-0.38.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/yarl-1.8.1:0 -doc python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/zipp-3.10.0:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-python/zope-interface-5.5.1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-util/checkbashisms-2.22.2:0 dev-util/cmake-3.24.3:0 -doc -emacs ncurses -qt5 -test -test -verify-sig dev-util/desktop-file-utils-0.26-r2:0 -emacs dev-util/glib-utils-2.74.1:0 python_single_target_python3_10 -python_single_target_python3_11 -python_single_target_python3_8 -python_single_target_python3_9 dev-util/gperf-3.1-r1:0 dev-util/gtk-doc-am-1.33.2:0 dev-util/intltool-0.51.0-r3:0 dev-util/meson-0.63.3:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -test dev-util/meson-format-array-0:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 dev-util/ninja-1.11.1-r2:0 -doc -emacs -test dev-util/pkgconf-1.8.0-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -test dev-util/re2c-2.2:0 -debug -test dev-vcs/git-2.38.1:0 blksha1 -cgi curl -cvs -doc -gnome-keyring gpg -highlight iconv -mediawiki -mediawiki-experimental nls pcre -perforce -perl python_single_target_python3_10 -python_single_target_python3_8 -python_single_target_python3_9 safe-directory -selinux -subversion -test -tk webdav -xinetd media-fonts/liberation-fonts-2.1.5:0 -X -X -fontforge media-gfx/graphite2-1.3.14_p20210810-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -perl -test media-libs/fontconfig-2.14.1-r1:1.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc nls -test media-libs/freetype-2.12.1-r1:2 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 adobe-cff -brotli bzip2 cleartype-hinting -debug -doc -fontforge harfbuzz -infinality png -static-libs -svg -utils media-libs/harfbuzz-5.3.1:0/4.0.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 cairo -debug -doc -experimental glib graphite -icu introspection -test truetype media-libs/libpng-1.6.38:0/16 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -apng -cpu_flags_arm_neon cpu_flags_x86_sse -static-libs net-dns/c-ares-1.18.1:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -test net-dns/libidn2-2.3.4:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static-libs -verify-sig 
net-firewall/iptables-1.8.8-r4:0/1.8.3 -conntrack -netlink -nftables -pcap split-usr -static-libs net-libs/gnutls-3.7.8:0/30.30 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -brotli cxx -dane -doc -examples -guile idn nls openssl -pkcs11 seccomp -sslv2 -sslv3 -static-libs -test -test-full tls-heartbeat -tools -valgrind -verify-sig zlib -zstd net-libs/libmnl-1.0.5:0/0.2.0 -examples -verify-sig net-libs/libnsl-2.0.0-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs net-libs/libtirpc-1.3.3:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -kerberos split-usr -static-libs net-libs/nghttp2-1.50.0:0/1.14 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cxx -debug -hpack-tools -jemalloc -static-libs -test -utils -xml net-misc/curl-7.86.0-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 adns -alt-svc -brotli -curl_ssl_gnutls -curl_ssl_mbedtls -curl_ssl_nss curl_ssl_openssl ftp -gnutls -gopher -hsts http2 -idn imap ipv6 -kerberos -ldap -mbedtls -nghttp3 -nss openssl pop3 progress-meter -quiche -rtmp -samba smtp -ssh ssl -sslv3 -static-libs -telnet -test tftp -verify-sig -websockets -zstd net-misc/dhcpcd-9.4.1:0 -debug embedded ipv6 -privsep udev net-misc/iputils-20211215:0 arping -caps -clockdiff -doc filecaps -idn nls -rarpd -rdisc -static -test -tracepath net-misc/netifrc-0.7.3-r1:0 dhcp net-misc/openssh-9.1_p1:0 -X -X509 -abi_mips_n32 -audit -debug -hpn -kerberos -ldns -libedit -livecd pam pie -sctp -security-key -selinux ssl -static -test -verify-sig -xmss net-misc/rsync-3.2.7:0 acl -examples iconv -lz4 python_single_target_python3_10 -python_single_target_python3_8 -python_single_target_python3_9 ssl -stunnel -system-zlib -verify-sig xattr -xxhash -zstd net-misc/wget-1.21.3-r1:0 -cookie-check -debug -gnutls -idn ipv6 -metalink nls -ntlm pcre ssl -static -test -uuid -verify-sig zlib perl-core/Compress-Raw-Zlib-2.202.0:0 perl-core/File-Temp-0.231.100:0 sec-keys/openpgp-keys-gentoo-release-20220101:0 -test sys-apps/acl-2.3.1-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls split-usr -static-libs sys-apps/attr-2.5.1-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls split-usr -static-libs sys-apps/baselayout-2.9:0 -build split-usr sys-apps/coreutils-9.1-r1:0 acl -caps -gmp -hostname -kill -multicall nls -selinux split-usr -static -test -vanilla -verify-sig xattr sys-apps/dbus-1.15.2:0 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc elogind -selinux -static-libs -systemd -test -test sys-apps/debianutils-5.7:0 installkernel -static sys-apps/diffutils-3.8:0 nls -static -verify-sig sys-apps/file-5.43:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma -python python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 -seccomp -static-libs -verify-sig zlib sys-apps/findutils-4.9.0-r2:0 nls -selinux -static -test -verify-sig sys-apps/gawk-5.1.1-r2:0 -mpfr nls readline -verify-sig sys-apps/gentoo-functions-0.17:0 sys-apps/grep-3.8:0 nls pcre -static -verify-sig sys-apps/groff-1.22.4:0 -X 
-examples -uchardet sys-apps/help2man-1.49.2:0 nls sys-apps/install-xattr-0.8:0 sys-apps/iproute2-6.0.0:0 -atm -berkdb -bpf -caps -elf iptables -libbsd -minimal -nfs -selinux split-usr sys-apps/kbd-2.5.1:0 nls pam -test sys-apps/kmod-30:0 -debug -doc lzma -pkcs7 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs tools zlib zstd sys-apps/less-608:0 pcre unicode sys-apps/lsb-release-3.2:0 sys-apps/man-db-2.11.0:0 manpager nls seccomp -selinux -static-libs zlib sys-apps/man-pages-6.01:0 l10n_cs l10n_da l10n_de l10n_el l10n_es l10n_fi l10n_fr l10n_hu l10n_id l10n_it l10n_ja l10n_mk l10n_nb l10n_nl l10n_pl l10n_pt-BR l10n_ro l10n_ru l10n_sr l10n_sv l10n_uk l10n_vi l10n_zh-CN sys-apps/miscfiles-1.5-r4:0 -minimal sys-apps/net-tools-2.10:0 arp hostname ipv6 -nis nls -plipconfig -selinux -slattach -static sys-apps/openrc-0.45.2-r1:0 -audit -bash -debug ncurses netifrc -newnet pam -selinux -sysv-utils unicode sys-apps/portage-3.0.38.1-r2:0 -apidoc -build -doc -gentoo-dev ipc native-extensions python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 rsync-verify -selinux -test xattr sys-apps/sandbox-2.29:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 nnp sys-apps/sed-4.8-r1:0 acl nls -selinux -static -verify-sig sys-apps/shadow-4.12.3:0/4 acl -audit -bcrypt -cracklib nls pam -selinux -skey split-usr -su -verify-sig xattr sys-apps/systemd-utils-251.7:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -boot kmod -selinux split-usr -split-usr -sysusers -test tmpfiles udev sys-apps/sysvinit-3.05:0 -ibm nls -selinux -static -verify-sig sys-apps/texinfo-6.8:0 nls standalone -static sys-apps/util-linux-2.38.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -audit -build -caps cramfs -cryptsetup -fdformat hardlink -kill logger -magic ncurses nls pam -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 readline -rtas -selinux -slang split-usr -static-libs su suid -systemd -test -tty-helpers -udev unicode -verify-sig sys-apps/which-2.21:0 sys-auth/elogind-246.10-r2:0 acl -audit cgroup-hybrid -debug -doc pam policykit -selinux -test sys-auth/pambase-20220214:0 -caps -debug elogind -gnome-keyring -homed -minimal -mktemp nullok -pam_krb5 -pam_ssh passwdqc -pwhistory -pwquality -securetty -selinux sha512 -systemd -yescrypt sys-auth/passwdqc-2.0.2-r1:0 sys-auth/polkit-121:0 duktape -examples -gtk introspection -kde pam -selinux -systemd -test sys-devel/autoconf-2.71-r4:2.71 -emacs sys-devel/autoconf-archive-2022.09.03:0 sys-devel/autoconf-wrapper-20220130:0 sys-devel/automake-1.16.5:1.16 -test sys-devel/automake-wrapper-11-r1:0 sys-devel/bc-1.07.1-r5:0 -libedit readline -static sys-devel/binutils-2.39-r4:2.39 -cet -default-gold -doc -gold -gprofng -multitarget nls -pgo plugins -static-libs -test -vanilla sys-devel/binutils-config-5.4.1:0 native-symlinks sys-devel/bison-3.8.2:0 -examples nls -static -test -verify-sig sys-devel/flex-2.6.4-r4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static -test sys-devel/gcc-12.2.1_p20221008:12 -ada -cet -custom-cflags cxx -d -debug -doc -fixed-point fortran -go graphite -hardened -jit -libssp lto multilib nls nptl -objc -objc++ -objc-gc openmp -pch -pgo pie sanitize ssp -systemtap -test -valgrind 
-vanilla -vtv -zstd sys-devel/gcc-config-2.8:0 cc-wrappers native-symlinks sys-devel/gettext-0.21.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -cvs cxx -doc -emacs -git -java -java ncurses nls openmp -static-libs -verify-sig sys-devel/gnuconfig-20221007:0 sys-devel/libtool-2.4.7:2 -vanilla sys-devel/m4-1.4.19:0 -examples nls -verify-sig sys-devel/make-4.4:0 -guile nls -static -verify-sig sys-devel/patch-2.7.6-r5:0 -static -test -verify-sig xattr sys-fs/e2fsprogs-1.46.5-r4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cron -fuse -lto nls split-usr -static-libs -test tools sys-fs/udev-init-scripts-35:0 sys-kernel/installkernel-gentoo-5:0 -grub sys-kernel/linux-headers-6.0:0 -headers-only sys-libs/gdbm-1.23:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 berkdb nls readline -static-libs -test -verify-sig sys-libs/glibc-2.36-r5:2.2 -audit -caps -cet -compile-locales -crypt -custom-cflags -doc -gd -hash-sysv-compat -headers-only multiarch multilib -multilib-bootstrap -nscd -profile -selinux ssp stack-realign static-libs -suid -systemd -systemtap -test -vanilla sys-libs/libcap-2.66:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 pam split-usr -static-libs -tools sys-libs/libseccomp-2.5.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs -test sys-libs/libxcrypt-4.4.30:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 compat -headers-only split-usr -static-libs system -test sys-libs/ncurses-6.3_p20220924:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -ada cxx -debug -doc -gpm -minimal -profile split-usr stack-realign -static-libs -test tinfo -trace -verify-sig sys-libs/pam-1.5.2-r3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -audit -berkdb -debug filecaps -nis -selinux split-usr sys-libs/readline-8.1_p2-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static-libs unicode -utils -verify-sig sys-libs/timezone-data-2022f:0 -leaps-timezone nls -zic-slim sys-libs/zlib-1.2.13-r1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 minizip split-usr -static-libs -verify-sig sys-process/procps-3.3.17-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 elogind kill -modern-top ncurses nls -selinux split-usr -static-libs -systemd -test unicode sys-process/psmisc-23.5:0 -X nls -selinux -test virtual/acl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs virtual/awk-1:0 virtual/dev-manager-0-r2:0 virtual/editor-0-r3:0 virtual/libc-1-r1:0 virtual/libcrypt-2-r1:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs virtual/libelf-3-r1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 virtual/libiconv-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 
-abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 virtual/libintl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 virtual/libudev-232-r7:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -systemd virtual/man-0-r4:0 virtual/os-headers-0-r2:0 virtual/package-manager-1:0 virtual/pager-0-r1:0 virtual/perl-Carp-1.520.0-r2:0 virtual/perl-Compress-Raw-Bzip2-2.103.0-r3:0 virtual/perl-Compress-Raw-Zlib-2.202.0:0 virtual/perl-CPAN-2.330.0:0 virtual/perl-CPAN-Meta-2.150.10-r6:0 virtual/perl-CPAN-Meta-Requirements-2.140.0-r8:0 virtual/perl-CPAN-Meta-YAML-0.18.0-r8:0 virtual/perl-Data-Dumper-2.184.0:0 virtual/perl-Digest-MD5-2.580.0-r1:0 virtual/perl-Encode-3.170.0:0 virtual/perl-Exporter-5.770.0:0 virtual/perl-ExtUtils-CBuilder-0.280.236-r1:0 virtual/perl-ExtUtils-Install-2.200.0-r1:0 virtual/perl-ExtUtils-MakeMaker-7.640.0:0 virtual/perl-ExtUtils-Manifest-1.730.0-r1:0 virtual/perl-ExtUtils-ParseXS-3.450.0:0 virtual/perl-File-Path-2.180.0-r1:0 virtual/perl-File-Spec-3.840.0:0 virtual/perl-File-Temp-0.231.100:0 virtual/perl-Getopt-Long-2.520.0-r1:0 virtual/perl-IO-1.500.0:0 virtual/perl-IO-Compress-2.106.0:0 virtual/perl-IO-Socket-IP-0.410.0-r1:0 virtual/perl-JSON-PP-4.70.0:0 virtual/perl-libnet-3.140.0:0 ssl virtual/perl-MIME-Base64-3.160.0-r1:0 virtual/perl-Module-Metadata-1.0.37-r2:0 virtual/perl-parent-0.238.0-r2:0 virtual/perl-Parse-CPAN-Meta-2.150.10-r6:0 virtual/perl-Perl-OSType-1.10.0-r6:0 virtual/perl-podlators-4.140.0-r3:0 virtual/perl-Scalar-List-Utils-1.620.0:0 virtual/perl-Test-Harness-3.440.0:0 virtual/perl-Text-ParseWords-3.310.0:0 virtual/perl-Time-Local-1.300.0-r1:0 virtual/perl-version-0.992.900:0 virtual/perl-XSLoader-0.310.0:0 virtual/pkgconfig-2-r1:0 virtual/python-cffi-1:0 python_targets_pypy3 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 virtual/rust-1.64.0:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -rustfmt virtual/service-manager-1-r1:0 virtual/ssh-0-r1:0 -minimal virtual/tmpfiles-0-r3:0 virtual/ttf-fonts-1-r2:0 virtual/udev-217-r5:0 virtual/w3m-1:0 virtual/yacc-0:0 www-client/pybugz-0.13-r2:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 www-client/w3m-0.5.3_p20220429-r1:0 -X -fbcon -gdk-pixbuf -gpm -imlib l10n_ja -lynxkeymap nls -nntp ssl unicode -xface x11-apps/xprop-1.2.5:0 x11-apps/xset-1.2.4-r1:0 x11-base/xcb-proto-1.15.2:0 python_targets_python3_10 -python_targets_python3_11 python_targets_python3_8 python_targets_python3_9 x11-base/xorg-proto-2022.2:0 -test x11-libs/cairo-1.17.6:0 X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -aqua -debug -gles2-only -gles3 glib -gtk-doc -opengl -test x11-libs/libICE-1.0.10-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 ipv6 x11-libs/libSM-1.2.3-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc ipv6 uuid x11-libs/libX11-1.8.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -test x11-libs/libXau-1.0.10:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc x11-libs/libxcb-1.15-r1:0/1.12 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 
-abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -doc -selinux -test xkb x11-libs/libXdmcp-1.1.3-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc x11-libs/libXext-1.3.5:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc x11-libs/libXmu-1.1.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc x11-libs/libXrender-0.9.11:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 x11-libs/libXt-1.2.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -test x11-libs/pixman-0.42.2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cpu_flags_arm_iwmmxt -cpu_flags_arm_iwmmxt2 -cpu_flags_arm_neon -cpu_flags_ppc_altivec cpu_flags_x86_mmxext cpu_flags_x86_sse2 cpu_flags_x86_ssse3 -loongson2f -static-libs -test x11-libs/xtrans-1.4.0:0 -doc x11-misc/compose-tables-1.8.1:0 x11-misc/shared-mime-info-2.2:0 -test x11-misc/xdg-utils-1.1.3_p20210805:0 -dbus -doc -gnome

#######################
# build.log           #
#######################
>>> Unpacking source...
>>> Unpacking msrest-0.7.1.zip to /var/tmp/portage/dev-python/msrest-0.7.1/work
>>> Source unpacked in /var/tmp/portage/dev-python/msrest-0.7.1/work
>>> Preparing source in /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1 ...
 * Build system packages:
 *   dev-python/gpep517 : 9
 *   dev-python/installer : 0.5.1-r1
 *   dev-python/setuptools : 65.5.0
 *   dev-python/setuptools_scm : 7.0.5
 *   dev-python/wheel : 0.38.1
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1 ...
>>> Source configured.
>>> Compiling source in /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1 ...
 * python3_8: running distutils-r1_run_phase distutils-r1_python_compile
 * Building the wheel for msrest-0.7.1 via setuptools.build_meta:__legacy__
gpep517 build-wheel --backend setuptools.build_meta:__legacy__ --output-fd 3 --wheel-dir /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/wheel
2022-11-05 03:38:10,483 gpep517 INFO Building wheel via backend setuptools.build_meta:__legacy__
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/msrest
copying msrest/__init__.py -> build/lib/msrest
copying msrest/service_client.py -> build/lib/msrest
copying msrest/async_client.py -> build/lib/msrest
copying msrest/version.py -> build/lib/msrest
copying msrest/authentication.py -> build/lib/msrest
copying msrest/async_paging.py -> build/lib/msrest
copying msrest/http_logger.py -> build/lib/msrest
copying msrest/paging.py -> build/lib/msrest
copying msrest/configuration.py -> build/lib/msrest
copying msrest/exceptions.py -> build/lib/msrest
copying msrest/serialization.py -> build/lib/msrest
creating build/lib/msrest/pipeline
copying msrest/pipeline/universal.py -> build/lib/msrest/pipeline
copying msrest/pipeline/__init__.py -> build/lib/msrest/pipeline
copying msrest/pipeline/async_requests.py -> build/lib/msrest/pipeline
copying msrest/pipeline/async_abc.py -> build/lib/msrest/pipeline
copying msrest/pipeline/aiohttp.py -> build/lib/msrest/pipeline
copying msrest/pipeline/requests.py -> build/lib/msrest/pipeline
creating build/lib/msrest/universal_http
copying msrest/universal_http/__init__.py -> build/lib/msrest/universal_http
copying msrest/universal_http/async_requests.py -> build/lib/msrest/universal_http
copying msrest/universal_http/async_abc.py -> build/lib/msrest/universal_http
copying msrest/universal_http/aiohttp.py -> build/lib/msrest/universal_http
copying msrest/universal_http/requests.py -> build/lib/msrest/universal_http
creating build/lib/msrest/polling
copying msrest/polling/__init__.py -> build/lib/msrest/polling
copying msrest/polling/async_poller.py -> build/lib/msrest/polling
copying msrest/polling/poller.py -> build/lib/msrest/polling
running egg_info
writing msrest.egg-info/PKG-INFO
writing dependency_links to msrest.egg-info/dependency_links.txt
writing requirements to msrest.egg-info/requires.txt
writing top-level names to msrest.egg-info/top_level.txt
2022-11-05 03:38:10,826 setuptools_scm.file_finder_git ERROR listing git files failed - pretending there aren't any
reading manifest file 'msrest.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'msrest.egg-info/SOURCES.txt'
copying msrest/py.typed -> build/lib/msrest
warning: build_py: byte-compiling is disabled, skipping.
2022-11-05 03:38:10,861 wheel INFO installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/py.typed -> build/bdist.linux-x86_64/wheel/msrest
creating build/bdist.linux-x86_64/wheel/msrest/polling
copying build/lib/msrest/polling/poller.py -> build/bdist.linux-x86_64/wheel/msrest/polling
copying build/lib/msrest/polling/async_poller.py -> build/bdist.linux-x86_64/wheel/msrest/polling
copying build/lib/msrest/polling/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/polling
creating build/bdist.linux-x86_64/wheel/msrest/universal_http
copying build/lib/msrest/universal_http/requests.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http
copying build/lib/msrest/universal_http/aiohttp.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http
copying build/lib/msrest/universal_http/async_abc.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http
copying build/lib/msrest/universal_http/async_requests.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http
copying build/lib/msrest/universal_http/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http
creating build/bdist.linux-x86_64/wheel/msrest/pipeline
copying build/lib/msrest/pipeline/requests.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline
copying build/lib/msrest/pipeline/aiohttp.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline
copying build/lib/msrest/pipeline/async_abc.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline
copying build/lib/msrest/pipeline/async_requests.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline
copying build/lib/msrest/pipeline/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline
copying build/lib/msrest/pipeline/universal.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline
copying build/lib/msrest/serialization.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/exceptions.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/configuration.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/paging.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/http_logger.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/async_paging.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/authentication.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/version.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/async_client.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/service_client.py -> build/bdist.linux-x86_64/wheel/msrest
copying build/lib/msrest/__init__.py -> build/bdist.linux-x86_64/wheel/msrest
warning: install_lib: byte-compiling is disabled, skipping.
running install_egg_info Copying msrest.egg-info to build/bdist.linux-x86_64/wheel/msrest-0.7.1-py3.8.egg-info running install_scripts 2022-11-05 03:38:10,938 wheel INFO creating build/bdist.linux-x86_64/wheel/msrest-0.7.1.dist-info/WHEEL 2022-11-05 03:38:10,938 wheel INFO creating '/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/wheel/tmp8_sydxjr/msrest-0.7.1-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it 2022-11-05 03:38:10,939 wheel INFO adding 'msrest/__init__.py' 2022-11-05 03:38:10,939 wheel INFO adding 'msrest/async_client.py' 2022-11-05 03:38:10,940 wheel INFO adding 'msrest/async_paging.py' 2022-11-05 03:38:10,940 wheel INFO adding 'msrest/authentication.py' 2022-11-05 03:38:10,940 wheel INFO adding 'msrest/configuration.py' 2022-11-05 03:38:10,941 wheel INFO adding 'msrest/exceptions.py' 2022-11-05 03:38:10,941 wheel INFO adding 'msrest/http_logger.py' 2022-11-05 03:38:10,942 wheel INFO adding 'msrest/paging.py' 2022-11-05 03:38:10,942 wheel INFO adding 'msrest/py.typed' 2022-11-05 03:38:10,942 wheel INFO adding 'msrest/serialization.py' 2022-11-05 03:38:10,943 wheel INFO adding 'msrest/service_client.py' 2022-11-05 03:38:10,943 wheel INFO adding 'msrest/version.py' 2022-11-05 03:38:10,944 wheel INFO adding 'msrest/pipeline/__init__.py' 2022-11-05 03:38:10,944 wheel INFO adding 'msrest/pipeline/aiohttp.py' 2022-11-05 03:38:10,945 wheel INFO adding 'msrest/pipeline/async_abc.py' 2022-11-05 03:38:10,945 wheel INFO adding 'msrest/pipeline/async_requests.py' 2022-11-05 03:38:10,945 wheel INFO adding 'msrest/pipeline/requests.py' 2022-11-05 03:38:10,946 wheel INFO adding 'msrest/pipeline/universal.py' 2022-11-05 03:38:10,946 wheel INFO adding 'msrest/polling/__init__.py' 2022-11-05 03:38:10,946 wheel INFO adding 'msrest/polling/async_poller.py' 2022-11-05 03:38:10,947 wheel INFO adding 'msrest/polling/poller.py' 2022-11-05 03:38:10,947 wheel INFO adding 'msrest/universal_http/__init__.py' 2022-11-05 03:38:10,948 wheel INFO adding 'msrest/universal_http/aiohttp.py' 2022-11-05 03:38:10,948 wheel INFO adding 'msrest/universal_http/async_abc.py' 2022-11-05 03:38:10,948 wheel INFO adding 'msrest/universal_http/async_requests.py' 2022-11-05 03:38:10,949 wheel INFO adding 'msrest/universal_http/requests.py' 2022-11-05 03:38:10,949 wheel INFO adding 'msrest-0.7.1.dist-info/METADATA' 2022-11-05 03:38:10,950 wheel INFO adding 'msrest-0.7.1.dist-info/WHEEL' 2022-11-05 03:38:10,950 wheel INFO adding 'msrest-0.7.1.dist-info/top_level.txt' 2022-11-05 03:38:10,950 wheel INFO adding 'msrest-0.7.1.dist-info/RECORD' 2022-11-05 03:38:10,950 wheel INFO removing build/bdist.linux-x86_64/wheel 2022-11-05 03:38:10,952 gpep517 INFO The backend produced /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/wheel/msrest-0.7.1-py3-none-any.whl * Installing msrest-0.7.1-py3-none-any.whl to /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/install gpep517 install-wheel --destdir=/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/install --interpreter=/usr/bin/python3.8 --prefix=/usr --optimize=all /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/wheel/msrest-0.7.1-py3-none-any.whl 2022-11-05 03:38:11,500 gpep517 INFO Installing /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/wheel/msrest-0.7.1-py3-none-any.whl into /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/install 2022-11-05 03:38:11,791 gpep517 INFO Installation complete * python3_9: running 
distutils-r1_run_phase distutils-r1_python_compile * Building the wheel for msrest-0.7.1 via setuptools.build_meta:__legacy__ gpep517 build-wheel --backend setuptools.build_meta:__legacy__ --output-fd 3 --wheel-dir /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/wheel 2022-11-05 03:38:12,703 gpep517 INFO Building wheel via backend setuptools.build_meta:__legacy__ running bdist_wheel running build running build_py creating build creating build/lib creating build/lib/msrest copying msrest/__init__.py -> build/lib/msrest copying msrest/service_client.py -> build/lib/msrest copying msrest/async_client.py -> build/lib/msrest copying msrest/version.py -> build/lib/msrest copying msrest/authentication.py -> build/lib/msrest copying msrest/async_paging.py -> build/lib/msrest copying msrest/http_logger.py -> build/lib/msrest copying msrest/paging.py -> build/lib/msrest copying msrest/configuration.py -> build/lib/msrest copying msrest/exceptions.py -> build/lib/msrest copying msrest/serialization.py -> build/lib/msrest creating build/lib/msrest/pipeline copying msrest/pipeline/universal.py -> build/lib/msrest/pipeline copying msrest/pipeline/__init__.py -> build/lib/msrest/pipeline copying msrest/pipeline/async_requests.py -> build/lib/msrest/pipeline copying msrest/pipeline/async_abc.py -> build/lib/msrest/pipeline copying msrest/pipeline/aiohttp.py -> build/lib/msrest/pipeline copying msrest/pipeline/requests.py -> build/lib/msrest/pipeline creating build/lib/msrest/universal_http copying msrest/universal_http/__init__.py -> build/lib/msrest/universal_http copying msrest/universal_http/async_requests.py -> build/lib/msrest/universal_http copying msrest/universal_http/async_abc.py -> build/lib/msrest/universal_http copying msrest/universal_http/aiohttp.py -> build/lib/msrest/universal_http copying msrest/universal_http/requests.py -> build/lib/msrest/universal_http creating build/lib/msrest/polling copying msrest/polling/__init__.py -> build/lib/msrest/polling copying msrest/polling/async_poller.py -> build/lib/msrest/polling copying msrest/polling/poller.py -> build/lib/msrest/polling running egg_info writing msrest.egg-info/PKG-INFO writing dependency_links to msrest.egg-info/dependency_links.txt writing requirements to msrest.egg-info/requires.txt writing top-level names to msrest.egg-info/top_level.txt 2022-11-05 03:38:13,031 setuptools_scm.file_finder_git ERROR listing git files failed - pretending there aren't any reading manifest file 'msrest.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' writing manifest file 'msrest.egg-info/SOURCES.txt' copying msrest/py.typed -> build/lib/msrest warning: build_py: byte-compiling is disabled, skipping. 
2022-11-05 03:38:13,070 wheel INFO installing to build/bdist.linux-x86_64/wheel running install running install_lib creating build/bdist.linux-x86_64 creating build/bdist.linux-x86_64/wheel creating build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/py.typed -> build/bdist.linux-x86_64/wheel/msrest creating build/bdist.linux-x86_64/wheel/msrest/polling copying build/lib/msrest/polling/poller.py -> build/bdist.linux-x86_64/wheel/msrest/polling copying build/lib/msrest/polling/async_poller.py -> build/bdist.linux-x86_64/wheel/msrest/polling copying build/lib/msrest/polling/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/polling creating build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/requests.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/aiohttp.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/async_abc.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/async_requests.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http creating build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/requests.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/aiohttp.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/async_abc.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/async_requests.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/universal.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/serialization.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/exceptions.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/configuration.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/paging.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/http_logger.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/async_paging.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/authentication.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/version.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/async_client.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/service_client.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/__init__.py -> build/bdist.linux-x86_64/wheel/msrest warning: install_lib: byte-compiling is disabled, skipping. 
running install_egg_info Copying msrest.egg-info to build/bdist.linux-x86_64/wheel/msrest-0.7.1-py3.9.egg-info running install_scripts 2022-11-05 03:38:13,162 wheel INFO creating build/bdist.linux-x86_64/wheel/msrest-0.7.1.dist-info/WHEEL 2022-11-05 03:38:13,163 wheel INFO creating '/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/wheel/tmp071tcbg0/msrest-0.7.1-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it 2022-11-05 03:38:13,164 wheel INFO adding 'msrest/__init__.py' 2022-11-05 03:38:13,164 wheel INFO adding 'msrest/async_client.py' 2022-11-05 03:38:13,165 wheel INFO adding 'msrest/async_paging.py' 2022-11-05 03:38:13,165 wheel INFO adding 'msrest/authentication.py' 2022-11-05 03:38:13,165 wheel INFO adding 'msrest/configuration.py' 2022-11-05 03:38:13,166 wheel INFO adding 'msrest/exceptions.py' 2022-11-05 03:38:13,166 wheel INFO adding 'msrest/http_logger.py' 2022-11-05 03:38:13,167 wheel INFO adding 'msrest/paging.py' 2022-11-05 03:38:13,167 wheel INFO adding 'msrest/py.typed' 2022-11-05 03:38:13,168 wheel INFO adding 'msrest/serialization.py' 2022-11-05 03:38:13,169 wheel INFO adding 'msrest/service_client.py' 2022-11-05 03:38:13,169 wheel INFO adding 'msrest/version.py' 2022-11-05 03:38:13,170 wheel INFO adding 'msrest/pipeline/__init__.py' 2022-11-05 03:38:13,170 wheel INFO adding 'msrest/pipeline/aiohttp.py' 2022-11-05 03:38:13,170 wheel INFO adding 'msrest/pipeline/async_abc.py' 2022-11-05 03:38:13,171 wheel INFO adding 'msrest/pipeline/async_requests.py' 2022-11-05 03:38:13,171 wheel INFO adding 'msrest/pipeline/requests.py' 2022-11-05 03:38:13,172 wheel INFO adding 'msrest/pipeline/universal.py' 2022-11-05 03:38:13,172 wheel INFO adding 'msrest/polling/__init__.py' 2022-11-05 03:38:13,173 wheel INFO adding 'msrest/polling/async_poller.py' 2022-11-05 03:38:13,173 wheel INFO adding 'msrest/polling/poller.py' 2022-11-05 03:38:13,174 wheel INFO adding 'msrest/universal_http/__init__.py' 2022-11-05 03:38:13,174 wheel INFO adding 'msrest/universal_http/aiohttp.py' 2022-11-05 03:38:13,175 wheel INFO adding 'msrest/universal_http/async_abc.py' 2022-11-05 03:38:13,175 wheel INFO adding 'msrest/universal_http/async_requests.py' 2022-11-05 03:38:13,176 wheel INFO adding 'msrest/universal_http/requests.py' 2022-11-05 03:38:13,176 wheel INFO adding 'msrest-0.7.1.dist-info/METADATA' 2022-11-05 03:38:13,177 wheel INFO adding 'msrest-0.7.1.dist-info/WHEEL' 2022-11-05 03:38:13,177 wheel INFO adding 'msrest-0.7.1.dist-info/top_level.txt' 2022-11-05 03:38:13,178 wheel INFO adding 'msrest-0.7.1.dist-info/RECORD' 2022-11-05 03:38:13,178 wheel INFO removing build/bdist.linux-x86_64/wheel 2022-11-05 03:38:13,180 gpep517 INFO The backend produced /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/wheel/msrest-0.7.1-py3-none-any.whl * Installing msrest-0.7.1-py3-none-any.whl to /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/install gpep517 install-wheel --destdir=/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/install --interpreter=/usr/bin/python3.9 --prefix=/usr --optimize=all /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/wheel/msrest-0.7.1-py3-none-any.whl 2022-11-05 03:38:13,703 gpep517 INFO Installing /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/wheel/msrest-0.7.1-py3-none-any.whl into /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_9/install 2022-11-05 03:38:13,951 gpep517 INFO Installation complete * python3_10: running 
distutils-r1_run_phase distutils-r1_python_compile * Building the wheel for msrest-0.7.1 via setuptools.build_meta:__legacy__ gpep517 build-wheel --backend setuptools.build_meta:__legacy__ --output-fd 3 --wheel-dir /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/wheel 2022-11-05 03:38:14,871 gpep517 INFO Building wheel via backend setuptools.build_meta:__legacy__ running bdist_wheel running build running build_py creating build creating build/lib creating build/lib/msrest copying msrest/__init__.py -> build/lib/msrest copying msrest/service_client.py -> build/lib/msrest copying msrest/async_client.py -> build/lib/msrest copying msrest/version.py -> build/lib/msrest copying msrest/authentication.py -> build/lib/msrest copying msrest/async_paging.py -> build/lib/msrest copying msrest/http_logger.py -> build/lib/msrest copying msrest/paging.py -> build/lib/msrest copying msrest/configuration.py -> build/lib/msrest copying msrest/exceptions.py -> build/lib/msrest copying msrest/serialization.py -> build/lib/msrest creating build/lib/msrest/pipeline copying msrest/pipeline/universal.py -> build/lib/msrest/pipeline copying msrest/pipeline/__init__.py -> build/lib/msrest/pipeline copying msrest/pipeline/async_requests.py -> build/lib/msrest/pipeline copying msrest/pipeline/async_abc.py -> build/lib/msrest/pipeline copying msrest/pipeline/aiohttp.py -> build/lib/msrest/pipeline copying msrest/pipeline/requests.py -> build/lib/msrest/pipeline creating build/lib/msrest/universal_http copying msrest/universal_http/__init__.py -> build/lib/msrest/universal_http copying msrest/universal_http/async_requests.py -> build/lib/msrest/universal_http copying msrest/universal_http/async_abc.py -> build/lib/msrest/universal_http copying msrest/universal_http/aiohttp.py -> build/lib/msrest/universal_http copying msrest/universal_http/requests.py -> build/lib/msrest/universal_http creating build/lib/msrest/polling copying msrest/polling/__init__.py -> build/lib/msrest/polling copying msrest/polling/async_poller.py -> build/lib/msrest/polling copying msrest/polling/poller.py -> build/lib/msrest/polling running egg_info writing msrest.egg-info/PKG-INFO writing dependency_links to msrest.egg-info/dependency_links.txt writing requirements to msrest.egg-info/requires.txt writing top-level names to msrest.egg-info/top_level.txt 2022-11-05 03:38:15,136 setuptools_scm.file_finder_git ERROR listing git files failed - pretending there aren't any reading manifest file 'msrest.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' writing manifest file 'msrest.egg-info/SOURCES.txt' copying msrest/py.typed -> build/lib/msrest warning: build_py: byte-compiling is disabled, skipping. 
2022-11-05 03:38:15,172 wheel INFO installing to build/bdist.linux-x86_64/wheel running install running install_lib creating build/bdist.linux-x86_64 creating build/bdist.linux-x86_64/wheel creating build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/py.typed -> build/bdist.linux-x86_64/wheel/msrest creating build/bdist.linux-x86_64/wheel/msrest/polling copying build/lib/msrest/polling/poller.py -> build/bdist.linux-x86_64/wheel/msrest/polling copying build/lib/msrest/polling/async_poller.py -> build/bdist.linux-x86_64/wheel/msrest/polling copying build/lib/msrest/polling/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/polling creating build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/requests.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/aiohttp.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/async_abc.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/async_requests.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http copying build/lib/msrest/universal_http/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/universal_http creating build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/requests.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/aiohttp.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/async_abc.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/async_requests.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/__init__.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/pipeline/universal.py -> build/bdist.linux-x86_64/wheel/msrest/pipeline copying build/lib/msrest/serialization.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/exceptions.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/configuration.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/paging.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/http_logger.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/async_paging.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/authentication.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/version.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/async_client.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/service_client.py -> build/bdist.linux-x86_64/wheel/msrest copying build/lib/msrest/__init__.py -> build/bdist.linux-x86_64/wheel/msrest warning: install_lib: byte-compiling is disabled, skipping. 
running install_egg_info Copying msrest.egg-info to build/bdist.linux-x86_64/wheel/msrest-0.7.1-py3.10.egg-info running install_scripts 2022-11-05 03:38:15,261 wheel INFO creating build/bdist.linux-x86_64/wheel/msrest-0.7.1.dist-info/WHEEL 2022-11-05 03:38:15,262 wheel INFO creating '/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/wheel/tmpy8o2phg3/msrest-0.7.1-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it 2022-11-05 03:38:15,263 wheel INFO adding 'msrest/__init__.py' 2022-11-05 03:38:15,263 wheel INFO adding 'msrest/async_client.py' 2022-11-05 03:38:15,263 wheel INFO adding 'msrest/async_paging.py' 2022-11-05 03:38:15,264 wheel INFO adding 'msrest/authentication.py' 2022-11-05 03:38:15,264 wheel INFO adding 'msrest/configuration.py' 2022-11-05 03:38:15,265 wheel INFO adding 'msrest/exceptions.py' 2022-11-05 03:38:15,265 wheel INFO adding 'msrest/http_logger.py' 2022-11-05 03:38:15,266 wheel INFO adding 'msrest/paging.py' 2022-11-05 03:38:15,266 wheel INFO adding 'msrest/py.typed' 2022-11-05 03:38:15,267 wheel INFO adding 'msrest/serialization.py' 2022-11-05 03:38:15,268 wheel INFO adding 'msrest/service_client.py' 2022-11-05 03:38:15,268 wheel INFO adding 'msrest/version.py' 2022-11-05 03:38:15,269 wheel INFO adding 'msrest/pipeline/__init__.py' 2022-11-05 03:38:15,269 wheel INFO adding 'msrest/pipeline/aiohttp.py' 2022-11-05 03:38:15,269 wheel INFO adding 'msrest/pipeline/async_abc.py' 2022-11-05 03:38:15,270 wheel INFO adding 'msrest/pipeline/async_requests.py' 2022-11-05 03:38:15,270 wheel INFO adding 'msrest/pipeline/requests.py' 2022-11-05 03:38:15,271 wheel INFO adding 'msrest/pipeline/universal.py' 2022-11-05 03:38:15,271 wheel INFO adding 'msrest/polling/__init__.py' 2022-11-05 03:38:15,272 wheel INFO adding 'msrest/polling/async_poller.py' 2022-11-05 03:38:15,272 wheel INFO adding 'msrest/polling/poller.py' 2022-11-05 03:38:15,273 wheel INFO adding 'msrest/universal_http/__init__.py' 2022-11-05 03:38:15,273 wheel INFO adding 'msrest/universal_http/aiohttp.py' 2022-11-05 03:38:15,274 wheel INFO adding 'msrest/universal_http/async_abc.py' 2022-11-05 03:38:15,274 wheel INFO adding 'msrest/universal_http/async_requests.py' 2022-11-05 03:38:15,275 wheel INFO adding 'msrest/universal_http/requests.py' 2022-11-05 03:38:15,275 wheel INFO adding 'msrest-0.7.1.dist-info/METADATA' 2022-11-05 03:38:15,276 wheel INFO adding 'msrest-0.7.1.dist-info/WHEEL' 2022-11-05 03:38:15,276 wheel INFO adding 'msrest-0.7.1.dist-info/top_level.txt' 2022-11-05 03:38:15,277 wheel INFO adding 'msrest-0.7.1.dist-info/RECORD' 2022-11-05 03:38:15,277 wheel INFO removing build/bdist.linux-x86_64/wheel 2022-11-05 03:38:15,279 gpep517 INFO The backend produced /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/wheel/msrest-0.7.1-py3-none-any.whl * Installing msrest-0.7.1-py3-none-any.whl to /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/install gpep517 install-wheel --destdir=/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/install --interpreter=/usr/bin/python3.10 --prefix=/usr --optimize=all /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/wheel/msrest-0.7.1-py3-none-any.whl 2022-11-05 03:38:15,834 gpep517 INFO Installing /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/wheel/msrest-0.7.1-py3-none-any.whl into /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_10/install 2022-11-05 03:38:16,166 gpep517 INFO Installation complete >>> Source compiled. 
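Each of the three implementations above repeats the same PEP 517 sequence: gpep517 asks the legacy setuptools backend for a wheel, then unpacks that wheel into the per-implementation image. Stripped of gpep517, the build step amounts to the sketch below (run from the unpacked msrest-0.7.1 source directory; only the output directory name "wheel" is my own choice):

    import os
    from setuptools.build_meta import __legacy__ as backend

    # Ask the legacy setuptools PEP 517 backend, named in the log, to
    # produce a wheel; it returns the wheel's basename.
    os.makedirs("wheel", exist_ok=True)
    basename = backend.build_wheel("wheel")
    print(basename)  # e.g. msrest-0.7.1-py3-none-any.whl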
>>> Test phase: dev-python/msrest-0.7.1 * python3_8: running distutils-r1_run_phase python_test python3.8 -m pytest -vv -ra -l -Wdefault --color=no -o console_output_style=count -p no:cov -p no:flake8 -p no:flakes -p no:pylint -p no:markdown -p no:sugar -p no:xvfb ============================= test session starts ============================== platform linux -- Python 3.8.15, pytest-7.2.0, pluggy-1.0.0 -- /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1-python3_8/install/usr/bin/python3.8 cachedir: .pytest_cache rootdir: /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1, configfile: setup.cfg collecting ... collected 210 items tests/test_auth.py::TestAuthentication::test_apikey_auth PASSED [ 1/210] tests/test_auth.py::TestAuthentication::test_basic_auth PASSED [ 2/210] tests/test_auth.py::TestAuthentication::test_basic_token_auth PASSED [ 3/210] tests/test_auth.py::TestAuthentication::test_cs_auth PASSED [ 4/210] tests/test_auth.py::TestAuthentication::test_eventgrid_auth PASSED [ 5/210] tests/test_auth.py::TestAuthentication::test_eventgrid_domain_auth PASSED [ 6/210] tests/test_auth.py::TestAuthentication::test_token_auth PASSED [ 7/210] tests/test_client.py::TestServiceClient::test_client_formdata_add PASSED [ 8/210] tests/test_client.py::TestServiceClient::test_client_request PASSED [ 9/210] tests/test_client.py::TestServiceClient::test_client_send PASSED [ 10/210] tests/test_client.py::TestServiceClient::test_client_stream_download PASSED [ 11/210] tests/test_client.py::TestServiceClient::test_context_manager PASSED [ 12/210] tests/test_client.py::TestServiceClient::test_deprecated_creds PASSED [ 13/210] tests/test_client.py::TestServiceClient::test_format_data PASSED [ 14/210] tests/test_client.py::TestServiceClient::test_format_url PASSED [ 15/210] tests/test_client.py::TestServiceClient::test_keep_alive PASSED [ 16/210] tests/test_client.py::TestServiceClient::test_request_builder PASSED [ 17/210] tests/test_client.py::TestServiceClient::test_sdk_context_manager PASSED [ 18/210] tests/test_exceptions.py::TestExceptions::test_custom_exception PASSED [ 19/210] tests/test_exceptions.py::TestExceptions::test_request_exception PASSED [ 20/210] tests/test_paging.py::TestPaging::test_advance_paging PASSED [ 21/210] tests/test_paging.py::TestPaging::test_basic_paging PASSED [ 22/210] tests/test_paging.py::TestPaging::test_get_paging PASSED [ 23/210] tests/test_paging.py::TestPaging::test_none_value PASSED [ 24/210] tests/test_paging.py::TestPaging::test_reset_paging PASSED [ 25/210] tests/test_pipeline.py::test_sans_io_exception PASSED [ 26/210] tests/test_pipeline.py::TestClientRequest::test_request_data PASSED [ 27/210] tests/test_pipeline.py::TestClientRequest::test_request_url_with_params PASSED [ 28/210] tests/test_pipeline.py::TestClientRequest::test_request_xml PASSED [ 29/210] tests/test_pipeline.py::TestClientResponse::test_raw_response PASSED [ 30/210] tests/test_polling.py::test_abc_polling PASSED [ 31/210] tests/test_polling.py::test_no_polling PASSED [ 32/210] tests/test_polling.py::test_poller PASSED [ 33/210] tests/test_polling.py::test_broken_poller PASSED [ 34/210] tests/test_requests_universal.py::test_session_callback PASSED [ 35/210] tests/test_requests_universal.py::test_max_retries_on_default_adapter PASSED [ 36/210] tests/test_requests_universal.py::test_threading_basic_requests PASSED [ 37/210] tests/test_requests_universal.py::test_threading_cfg_requests PASSED [ 38/210] tests/test_runtime.py::TestRuntime::test_credential_headers PASSED [ 39/210] 
tests/test_runtime.py::TestRuntime::test_request_fail PASSED [ 40/210] tests/test_runtime.py::TestRuntime::test_request_proxy PASSED [ 41/210] tests/test_runtime.py::TestRedirect::test_request_redirect_delete PASSED [ 42/210] tests/test_runtime.py::TestRedirect::test_request_redirect_get PASSED [ 43/210] tests/test_runtime.py::TestRedirect::test_request_redirect_head PASSED [ 44/210] tests/test_runtime.py::TestRedirect::test_request_redirect_post PASSED [ 45/210] tests/test_runtime.py::TestRedirect::test_request_redirect_put PASSED [ 46/210] tests/test_runtime.py::TestRuntimeRetry::test_request_retry_3_times PASSED [ 47/210] tests/test_runtime.py::TestRuntimeRetry::test_request_retry_404 PASSED [ 48/210] tests/test_runtime.py::TestRuntimeRetry::test_request_retry_408 PASSED [ 49/210] tests/test_runtime.py::TestRuntimeRetry::test_request_retry_501 PASSED [ 50/210] tests/test_runtime.py::TestRuntimeRetry::test_request_retry_502 PASSED [ 51/210] tests/test_runtime.py::TestRuntimeRetry::test_request_retry_505 PASSED [ 52/210] tests/test_runtime.py::TestRuntimeRetry::test_request_retry_max PASSED [ 53/210] tests/test_serialization.py::TestModelDeserialization::test_empty_enum_logs PASSED [ 54/210] tests/test_serialization.py::TestModelDeserialization::test_model_kwargs PASSED [ 55/210] tests/test_serialization.py::TestModelDeserialization::test_model_kwargs_logs PASSED [ 56/210] tests/test_serialization.py::TestModelDeserialization::test_response PASSED [ 57/210] tests/test_serialization.py::TestRuntimeSerialized::test_additional_properties PASSED [ 58/210] tests/test_serialization.py::TestRuntimeSerialized::test_additional_properties_declared PASSED [ 59/210] tests/test_serialization.py::TestRuntimeSerialized::test_additional_properties_manual PASSED [ 60/210] tests/test_serialization.py::TestRuntimeSerialized::test_additional_properties_no_send PASSED [ 61/210] tests/test_serialization.py::TestRuntimeSerialized::test_additional_properties_with_auto_model PASSED [ 62/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_bool PASSED [ 63/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_dict_simple PASSED [ 64/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_duration PASSED [ 65/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_enum PASSED [ 66/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_int PASSED [ 67/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_list_complex PASSED [ 68/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_list_simple PASSED [ 69/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_none PASSED [ 70/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_sequence PASSED [ 71/210] tests/test_serialization.py::TestRuntimeSerialized::test_attr_str PASSED [ 72/210] tests/test_serialization.py::TestRuntimeSerialized::test_datetime_types_as_type_object PASSED [ 73/210] tests/test_serialization.py::TestRuntimeSerialized::test_decimal_types_as_type_object PASSED [ 74/210] tests/test_serialization.py::TestRuntimeSerialized::test_empty_list PASSED [ 75/210] tests/test_serialization.py::TestRuntimeSerialized::test_json_with_xml_map PASSED [ 76/210] tests/test_serialization.py::TestRuntimeSerialized::test_key_type PASSED [ 77/210] tests/test_serialization.py::TestRuntimeSerialized::test_long_as_type_object PASSED [ 78/210] tests/test_serialization.py::TestRuntimeSerialized::test_model_validate PASSED [ 79/210] 
tests/test_serialization.py::TestRuntimeSerialized::test_obj_serialize_none PASSED [ 80/210] tests/test_serialization.py::TestRuntimeSerialized::test_obj_with_malformed_map PASSED [ 81/210] tests/test_serialization.py::TestRuntimeSerialized::test_obj_with_mismatched_map PASSED [ 82/210] tests/test_serialization.py::TestRuntimeSerialized::test_polymorphic_serialization PASSED [ 83/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_custom_model PASSED [ 84/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_datetime PASSED [ 85/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_direct_model PASSED [ 86/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_empty_iter PASSED [ 87/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_from_dict_datetime PASSED [ 88/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_int_as_iter_with_div PASSED [ 89/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_json_obj PASSED [ 90/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_object PASSED [ 91/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_primitive_types PASSED [ 92/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_query PASSED [ 93/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_str_as_iter PASSED [ 94/210] tests/test_serialization.py::TestRuntimeSerialized::test_serialize_time PASSED [ 95/210] tests/test_serialization.py::TestRuntimeSerialized::test_unicode_as_type_object PASSED [ 96/210] tests/test_serialization.py::TestRuntimeSerialized::test_validate PASSED [ 97/210] tests/test_serialization.py::TestRuntimeSerialized::test_validation_flag PASSED [ 98/210] tests/test_serialization.py::TestRuntimeSerialized::test_validation_type PASSED [ 99/210] tests/test_serialization.py::TestRuntimeDeserialized::test_additional_properties PASSED [100/210] tests/test_serialization.py::TestRuntimeDeserialized::test_additional_properties_declared PASSED [101/210] tests/test_serialization.py::TestRuntimeDeserialized::test_additional_properties_flattening PASSED [102/210] tests/test_serialization.py::TestRuntimeDeserialized::test_additional_properties_not_configured PASSED [103/210] tests/test_serialization.py::TestRuntimeDeserialized::test_array_deserialize PASSED [104/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_bool PASSED [105/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_enum PASSED [106/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_int PASSED [107/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_list_complex PASSED [108/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_list_in_list PASSED [109/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_list_simple PASSED [110/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_none PASSED [111/210] tests/test_serialization.py::TestRuntimeDeserialized::test_attr_str PASSED [112/210] tests/test_serialization.py::TestRuntimeDeserialized::test_basic_deserialization PASSED [113/210] tests/test_serialization.py::TestRuntimeDeserialized::test_cls_method_deserialization PASSED [114/210] tests/test_serialization.py::TestRuntimeDeserialized::test_deserialize_date PASSED [115/210] tests/test_serialization.py::TestRuntimeDeserialized::test_deserialize_datetime PASSED [116/210] 
tests/test_serialization.py::TestRuntimeDeserialized::test_deserialize_datetime_rfc PASSED [117/210] tests/test_serialization.py::TestRuntimeDeserialized::test_deserialize_flattening PASSED [118/210] tests/test_serialization.py::TestRuntimeDeserialized::test_deserialize_object PASSED [119/210] tests/test_serialization.py::TestRuntimeDeserialized::test_deserialize_storage PASSED [120/210] tests/test_serialization.py::TestRuntimeDeserialized::test_deserialize_time PASSED [121/210] tests/test_serialization.py::TestRuntimeDeserialized::test_failsafe_deserialization PASSED [122/210] tests/test_serialization.py::TestRuntimeDeserialized::test_invalid_json PASSED [123/210] tests/test_serialization.py::TestRuntimeDeserialized::test_long_as_type_object PASSED [124/210] tests/test_serialization.py::TestRuntimeDeserialized::test_non_obj_deserialization PASSED [125/210] tests/test_serialization.py::TestRuntimeDeserialized::test_obj_with_malformed_map PASSED [126/210] tests/test_serialization.py::TestRuntimeDeserialized::test_obj_with_no_attr PASSED [127/210] tests/test_serialization.py::TestRuntimeDeserialized::test_personalize_deserialization PASSED [128/210] tests/test_serialization.py::TestRuntimeDeserialized::test_polymorphic_deserialization PASSED [129/210] tests/test_serialization.py::TestRuntimeDeserialized::test_polymorphic_deserialization_with_escape PASSED [130/210] tests/test_serialization.py::TestRuntimeDeserialized::test_polymorphic_missing_info PASSED [131/210] tests/test_serialization.py::TestRuntimeDeserialized::test_rfc_pickable PASSED [132/210] tests/test_serialization.py::TestRuntimeDeserialized::test_robust_deserialization PASSED [133/210] tests/test_serialization.py::TestRuntimeDeserialized::test_twice_key_scenario PASSED [134/210] tests/test_serialization.py::TestModelInstanceEquality::test_model_instance_equality PASSED [135/210] tests/test_serialization.py::TestAzureCoreExceptions::test_azure_core_exceptions PASSED [136/210] tests/test_universal_pipeline.py::test_user_agent PASSED [137/210] tests/test_universal_pipeline.py::test_no_log PASSED [138/210] tests/test_universal_pipeline.py::test_raw_deserializer PASSED [139/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_basic PASSED [140/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_basic_unicode PASSED [141/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_basic_text PASSED [142/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_add_prop PASSED [143/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_object PASSED [144/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_object_no_text PASSED [145/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_basic_empty PASSED [146/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_basic_empty_list PASSED [147/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_list_wrapped_items_name_basic_types PASSED [148/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_list_not_wrapped_items_name_basic_types PASSED [149/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_list_wrapped_basic_types PASSED [150/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_list_not_wrapped_basic_types PASSED [151/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_list_wrapped_items_name_complex_types PASSED [152/210] 
tests/test_xml_serialization.py::TestXmlDeserialization::test_list_not_wrapped_items_name_complex_types PASSED [153/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_list_wrapped_complex_types PASSED [154/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_list_not_wrapped_complex_types PASSED [155/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_basic_additional_properties PASSED [156/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_basic_namespace PASSED [157/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_complex_namespace PASSED [158/210] tests/test_xml_serialization.py::TestXmlDeserialization::test_polymorphic_deserialization PASSED [159/210] tests/test_xml_serialization.py::TestXmlSerialization::test_basic PASSED [160/210] tests/test_xml_serialization.py::TestXmlSerialization::test_basic_unicode PASSED [161/210] tests/test_xml_serialization.py::TestXmlSerialization::test_nested_unicode PASSED [162/210] tests/test_xml_serialization.py::TestXmlSerialization::test_add_prop PASSED [163/210] tests/test_xml_serialization.py::TestXmlSerialization::test_object PASSED [164/210] tests/test_xml_serialization.py::TestXmlSerialization::test_type_basic PASSED [165/210] tests/test_xml_serialization.py::TestXmlSerialization::test_basic_text PASSED [166/210] tests/test_xml_serialization.py::TestXmlSerialization::test_direct_array PASSED [167/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_wrapped_basic_types PASSED [168/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_not_wrapped_basic_types PASSED [169/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_wrapped_items_name_complex_types PASSED [170/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_not_wrapped_items_name_complex_types PASSED [171/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_wrapped_complex_types PASSED [172/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_not_wrapped_complex_types PASSED [173/210] tests/test_xml_serialization.py::TestXmlSerialization::test_two_complex_same_type PASSED [174/210] tests/test_xml_serialization.py::TestXmlSerialization::test_basic_namespace PASSED [175/210] tests/test_xml_serialization.py::TestXmlSerialization::test_basic_is_xml PASSED [176/210] tests/test_xml_serialization.py::TestXmlSerialization::test_basic_unicode_is_xml PASSED [177/210] tests/test_xml_serialization.py::TestXmlSerialization::test_add_prop_is_xml PASSED [178/210] tests/test_xml_serialization.py::TestXmlSerialization::test_object_is_xml PASSED [179/210] tests/test_xml_serialization.py::TestXmlSerialization::test_type_basic_is_xml PASSED [180/210] tests/test_xml_serialization.py::TestXmlSerialization::test_direct_array_is_xml PASSED [181/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_wrapped_basic_types_is_xml PASSED [182/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_not_wrapped_basic_types_is_xml PASSED [183/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_wrapped_items_name_complex_types_is_xml PASSED [184/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_not_wrapped_items_name_complex_types_is_xml PASSED [185/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_wrapped_complex_types_is_xml PASSED [186/210] tests/test_xml_serialization.py::TestXmlSerialization::test_list_not_wrapped_complex_types_is_xml PASSED [187/210] 
tests/test_xml_serialization.py::TestXmlSerialization::test_two_complex_same_type_is_xml PASSED [188/210]
tests/test_xml_serialization.py::TestXmlSerialization::test_basic_namespace_is_xml PASSED [189/210]
tests/test_xml_serialization.py::TestXmlSerialization::test_complex_namespace PASSED [190/210]
tests/asynctests/test_async_client.py::TestServiceClient::test_client_send SKIPPED (async def function and no async plugin installed (see warnings)) [191/210]
tests/asynctests/test_async_client.py::TestServiceClient::test_client_stream_download SKIPPED (async def function and no async plugin installed (see warnings)) [192/210]
tests/asynctests/test_async_paging.py::TestPaging::test_basic_paging SKIPPED (async def function and no async plugin installed (see warnings)) [193/210]
tests/asynctests/test_async_paging.py::TestPaging::test_advance_paging SKIPPED (async def function and no async plugin installed (see warnings)) [194/210]
tests/asynctests/test_async_paging.py::TestPaging::test_get_paging SKIPPED (async def function and no async plugin installed (see warnings)) [195/210]
tests/asynctests/test_async_paging.py::TestPaging::test_reset_paging SKIPPED (async def function and no async plugin installed (see warnings)) [196/210]
tests/asynctests/test_async_paging.py::TestPaging::test_none_value SKIPPED (async def function and no async plugin installed (see warnings)) [197/210]
tests/asynctests/test_pipeline.py::test_sans_io_exception SKIPPED (async def function and no async plugin installed (see warnings)) [198/210]
tests/asynctests/test_pipeline.py::test_basic_aiohttp SKIPPED (async def function and no async plugin installed (see warnings)) [199/210]
tests/asynctests/test_pipeline.py::test_basic_async_requests SKIPPED (async def function and no async plugin installed (see warnings)) [200/210]
tests/asynctests/test_pipeline.py::test_conf_async_requests SKIPPED (async def function and no async plugin installed (see warnings)) [201/210]
tests/asynctests/test_pipeline.py::test_conf_async_trio_requests FAILED [202/210]
tests/asynctests/test_polling.py::test_abc_polling SKIPPED (async def function and no async plugin installed (see warnings)) [203/210]
tests/asynctests/test_polling.py::test_no_polling SKIPPED (async def function and no async plugin installed (see warnings)) [204/210]
tests/asynctests/test_polling.py::test_poller SKIPPED (async def function and no async plugin installed (see warnings)) [205/210]
tests/asynctests/test_polling.py::test_broken_poller SKIPPED (async def function and no async plugin installed (see warnings)) [206/210]
tests/asynctests/test_universal_http.py::test_basic_aiohttp SKIPPED (async def function and no async plugin installed (see warnings)) [207/210]
tests/asynctests/test_universal_http.py::test_basic_async_requests SKIPPED (async def function and no async plugin installed (see warnings)) [208/210]
tests/asynctests/test_universal_http.py::test_conf_async_requests SKIPPED (async def function and no async plugin installed (see warnings)) [209/210]
tests/asynctests/test_universal_http.py::test_conf_async_trio_requests FAILED [210/210]

=================================== FAILURES ===================================
________________________ test_conf_async_trio_requests _________________________

self =

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.

        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw["source_address"] = self.source_address

        if self.socket_options:
            extra_kw["socket_options"] = self.socket_options

        try:
>           conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

extra_kw = {'socket_options': [(6, 1, 1)]}
self =

/usr/lib/python3.8/site-packages/urllib3/connection.py:174:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('bing.com', 80), timeout = 100, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default.
        """

        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None

        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()

        try:
            host.encode("idna")
        except UnicodeError:
            return six.raise_from(
                LocationParseError(u"'%s', label empty or too long" % host), None
            )

>       for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):

address = ('bing.com', 80)
err = None
family =
host = 'bing.com'
port = 80
socket_options = [(6, 1, 1)]
source_address = None
timeout = 100

/usr/lib/python3.8/site-packages/urllib3/util/connection.py:72:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = 'bing.com', port = 80, family =
type = , proto = 0, flags = 0

    def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
        """Resolve host and port into list of address info entries.

        Translate the host/port argument into a sequence of 5-tuples that contain all the necessary arguments for creating a socket connected to that service. host is a domain name, a string representation of an IPv4/v6 address or None. port is a string service name such as 'http', a numeric port number or None. By passing None as the value of host and port, you can pass NULL to the underlying C API.

        The family, type and proto arguments can be optionally specified in order to narrow the list of addresses returned. Passing zero as a value for each of these arguments selects the full range of results.
        """
        # We override this function since we want to translate the numeric family
        # and socket type values to enum constants.
        addrlist = []
>       for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E       socket.gaierror: [Errno -3] Temporary failure in name resolution

addrlist = []
family =
flags = 0
host = 'bing.com'
port = 80
proto = 0
type =

/usr/lib/python3.8/socket.py:918: gaierror

During handling of the above exception, another exception occurred:

self =
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=0, connect=0, read=3, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=100, read=100, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False, err = None, clean_exit = False

    def urlopen(
        self,
        method,
        url,
        body=None,
        headers=None,
        retries=None,
        redirect=True,
        assert_same_host=True,
        timeout=_Default,
        pool_timeout=None,
        release_conn=None,
        chunked=False,
        body_pos=None,
        **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details.

        .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`.

        .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility.

        :param method: HTTP request method (such as GET, POST, PUT, etc.)
        :param url: The URL to perform the request on.
        :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers.
        :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned.
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
        :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too.
        :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts.
        :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`.
        :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period.
        :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``.
        :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False.
        :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed.
        :param \\**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib`
        """

        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme

        if headers is None:
            headers = self.headers

        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)

        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)

        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)

        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parsed_url.url)

        conn = None

        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1]
        release_this_conn = release_conn

        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )

        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()
            headers.update(self.proxy_headers)

        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None

        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False

        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)

        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)

            conn.timeout = timeout_obj.connect_timeout

            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn and http_tunnel_required:
                self._prepare_proxy(conn)

            # Make the request on the httplib connection object.
>           httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

_is_ssl_error_message_from_http_proxy = ._is_ssl_error_message_from_http_proxy at 0x7f0e6989a0d0>
assert_same_host = False
body = None
body_pos = None
chunked = False
clean_exit = False
conn = None
destination_scheme = None
err = None
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
http_tunnel_required = False
is_new_proxy_conn = False
method = 'GET'
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None)
pool_timeout = None
redirect = False
release_conn = False
release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries = Retry(total=0, connect=0, read=3, redirect=None, status=None)
self =
timeout = Timeout(connect=100, read=100, total=None)
timeout_obj = Timeout(connect=100, read=100, total=None)
url = '/'

/usr/lib/python3.8/site-packages/urllib3/connectionpool.py:703:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
conn =
method = 'GET', url = '/', timeout = Timeout(connect=100, read=100, total=None)
chunked = False
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}}
timeout_obj = Timeout(connect=100, read=100, total=None)

    def _make_request(
        self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
    ):
        """
        Perform a request on a given urllib connection object taken from our pool.

        :param conn: a connection from one of our connection pools
        :param timeout: Socket timeout in seconds for the request. This can be a float or integer, which will set the same timeout value for the socket connect and the socket read, or an instance of :class:`urllib3.util.Timeout`, which gives you more fine-grained control over your timeouts.
        """
        self.num_requests += 1

        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout

        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise

        # conn.request() calls http.client.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        try:
            if chunked:
                conn.request_chunked(method, url, **httplib_request_kw)
            else:
>               conn.request(method, url, **httplib_request_kw)

chunked = False
conn =
httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}}
method = 'GET'
self =
timeout = Timeout(connect=100, read=100, total=None)
timeout_obj = Timeout(connect=100, read=100, total=None)
url = '/'

/usr/lib/python3.8/site-packages/urllib3/connectionpool.py:398:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}

    def request(self, method, url, body=None, headers=None):
        if headers is None:
            headers = {}
        else:
            # Avoid modifying the headers passed into .request()
            headers = headers.copy()
        if "user-agent" not in (six.ensure_str(k.lower()) for k in headers):
            headers["User-Agent"] = _get_default_user_agent()
>       super(HTTPConnection, self).request(method, url, body=body, headers=headers)

__class__ =
body = None
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
method = 'GET'
self =
url = '/'

/usr/lib/python3.8/site-packages/urllib3/connection.py:239:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}

    def request(self, method, url, body=None, headers={}, *, encode_chunked=False):
        """Send a complete request to the server."""
>       self._send_request(method, url, body, headers, encode_chunked)

body = None
encode_chunked = False
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
method = 'GET'
self =
url = '/'

/usr/lib/python3.8/http/client.py:1256:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
encode_chunked = False

    def _send_request(self, method, url, body, headers, encode_chunked):
        # Honor explicitly requested Host: and Accept-Encoding: headers.
        header_names = frozenset(k.lower() for k in headers)
        skips = {}
        if 'host' in header_names:
            skips['skip_host'] = 1
        if 'accept-encoding' in header_names:
            skips['skip_accept_encoding'] = 1

        self.putrequest(method, url, **skips)

        # chunked encoding will happen if HTTP/1.1 is used and either
        # the caller passes encode_chunked=True or the following
        # conditions hold:
        # 1. content-length has not been explicitly set
        # 2. the body is a file or iterable, but not a str or bytes-like
        # 3. Transfer-Encoding has NOT been explicitly set by the caller

        if 'content-length' not in header_names:
            # only chunk body if not explicitly set for backwards
            # compatibility, assuming the client code is already handling the
            # chunking
            if 'transfer-encoding' not in header_names:
                # if content-length cannot be automatically determined, fall
                # back to chunked encoding
                encode_chunked = False
                content_length = self._get_content_length(body, method)
                if content_length is None:
                    if body is not None:
                        if self.debuglevel > 0:
                            print('Unable to determine size of %r' % body)
                        encode_chunked = True
                        self.putheader('Transfer-Encoding', 'chunked')
                else:
                    self.putheader('Content-Length', str(content_length))
        else:
            encode_chunked = False

        for hdr, value in headers.items():
            self.putheader(hdr, value)
        if isinstance(body, str):
            # RFC 2616 Section 3.7.1 says that text default has a
            # default charset of iso-8859-1.
            body = _encode(body, 'body')
>       self.endheaders(body, encode_chunked=encode_chunked)

body = None
content_length = None
encode_chunked = False
hdr = 'Connection'
header_names = frozenset({'connection', 'accept', 'accept-encoding', 'user-agent'})
headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
method = 'GET'
self =
skips = {'skip_accept_encoding': 1}
url = '/'
value = 'keep-alive'

/usr/lib/python3.8/http/client.py:1302:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
message_body = None

    def endheaders(self, message_body=None, *, encode_chunked=False):
        """Indicate that the last header line has been sent to the server.

        This method sends the request to the server. The optional message_body argument can be used to pass a message body associated with the request.
        """
        if self.__state == _CS_REQ_STARTED:
            self.__state = _CS_REQ_SENT
        else:
            raise CannotSendHeader()
>       self._send_output(message_body, encode_chunked=encode_chunked)

encode_chunked = False
message_body = None
self =

/usr/lib/python3.8/http/client.py:1251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
message_body = None, encode_chunked = False

    def _send_output(self, message_body=None, encode_chunked=False):
        """Send the currently buffered request and clear the buffer.

        Appends an extra \\r\\n to the buffer. A message_body may be specified, to be appended to the request.
        """
        self._buffer.extend((b"", b""))
        msg = b"\r\n".join(self._buffer)
        del self._buffer[:]

>       self.send(msg)

encode_chunked = False
message_body = None
msg = (b'GET / HTTP/1.1\r\nHost: bing.com\r\nUser-Agent: myusergant\r\nAccept-Encod' b'ing: gzip, deflate, br\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n')
self =

/usr/lib/python3.8/http/client.py:1011:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
data = b'GET / HTTP/1.1\r\nHost: bing.com\r\nUser-Agent: myusergant\r\nAccept-Encoding: gzip, deflate, br\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'

    def send(self, data):
        """Send `data' to the server.

        ``data`` can be a string object, a bytes object, an array object, a file-like object that supports a .read() method, or an iterable object.
""" if self.sock is None: if self.auto_open: > self.connect() data = (b'GET / HTTP/1.1\r\nHost: bing.com\r\nUser-Agent: myusergant\r\nAccept-Encod' b'ing: gzip, deflate, br\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n') self = /usr/lib/python3.8/http/client.py:951: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def connect(self): > conn = self._new_conn() self = /usr/lib/python3.8/site-packages/urllib3/connection.py:205: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self): """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ extra_kw = {} if self.source_address: extra_kw["source_address"] = self.source_address if self.socket_options: extra_kw["socket_options"] = self.socket_options try: conn = connection.create_connection( (self._dns_host, self.port), self.timeout, **extra_kw ) except SocketTimeout: raise ConnectTimeoutError( self, "Connection to %s timed out. (connect timeout=%s)" % (self.host, self.timeout), ) except SocketError as e: > raise NewConnectionError( self, "Failed to establish a new connection: %s" % e ) E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno -3] Temporary failure in name resolution extra_kw = {'socket_options': [(6, 1, 1)]} self = /usr/lib/python3.8/site-packages/urllib3/connection.py:186: NewConnectionError During handling of the above exception, another exception occurred: self = request = , stream = True timeout = Timeout(connect=100, read=100, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) cert = None chunked = False conn = proxies = OrderedDict() request = self = stream = True timeout = Timeout(connect=100, read=100, total=None) url = '/' verify = True /usr/lib/python3.8/site-packages/requests/adapters.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = method = 'GET', url = '/', body = None headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=2, connect=2, read=3, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=100, read=100, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False err = NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution') clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. 
:param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. :param \\**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib` """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = response_kw.get("preload_content", True) # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = six.ensure_str(_encode_target(url)) else: url = six.ensure_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() headers.update(self.proxy_headers) # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. 
timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout is_new_proxy_conn = self.proxy is not None and not getattr( conn, "sock", None ) if is_new_proxy_conn and http_tunnel_required: self._prepare_proxy(conn) # Make the request on the httplib connection object. httplib_response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, ) # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Pass method to Response for length checking response_kw["request_method"] = method # Import httplib's response into our own wrapper object response = self.ResponseCls.from_httplib( httplib_response, pool=self, connection=response_conn, retries=retries, **response_kw ) # Everything went great! clean_exit = True except EmptyPoolError: # Didn't get a connection from the pool, no need to clean up clean_exit = True release_this_conn = False raise except ( TimeoutError, HTTPException, SocketError, ProtocolError, BaseSSLError, SSLError, CertificateError, ) as e: # Discard the connection for these exceptions. It will be # replaced during the next _get_conn() call. clean_exit = False def _is_ssl_error_message_from_http_proxy(ssl_error): # We're trying to detect the message 'WRONG_VERSION_NUMBER' but # SSLErrors are kinda all over the place when it comes to the message, # so we try to cover our bases here! message = " ".join(re.split("[^a-z]", str(ssl_error).lower())) return ( "wrong version number" in message or "unknown protocol" in message ) # Try to detect a common user error with proxies which is to # set an HTTP proxy to be HTTPS when it should be 'http://' # (ie {'http': 'http://proxy', 'https': 'https://proxy'}) # Instead we add a nice error message and point to a URL. if ( isinstance(e, BaseSSLError) and self.proxy and _is_ssl_error_message_from_http_proxy(e) and conn.proxy and conn.proxy.scheme == "https" ): e = ProxyError( "Your proxy appears to only use HTTP and not HTTPS, " "try changing your proxy URL to be HTTP. See: " "https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html" "#https-proxy-error-http-proxy", SSLError(e), ) elif isinstance(e, (BaseSSLError, CertificateError)): e = SSLError(e) elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy: e = ProxyError("Cannot connect to proxy.", e) elif isinstance(e, (SocketError, HTTPException)): e = ProtocolError("Connection aborted.", e) retries = retries.increment( method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2] ) retries.sleep() # Keep track of the error for the retry warning. err = e finally: if not clean_exit: # We hit some kind of exception, handled or otherwise. We need # to throw the connection away unless explicitly told not to. # Close the connection, set the variable to None, and make sure # we put the None back in the pool to avoid leaking it. conn = conn and conn.close() release_this_conn = True if release_this_conn: # Put the connection back to be reused. If the connection is # expired then it will be None, which will get replaced with a # fresh connection during _get_conn. 
self._put_conn(conn) if not conn: # Try again log.warning( "Retrying (%r) after connection broken by '%r': %s", retries, err, url ) > return self.urlopen( method, url, body, headers, retries, redirect, assert_same_host, timeout=timeout, pool_timeout=pool_timeout, release_conn=release_conn, chunked=chunked, body_pos=body_pos, **response_kw ) _is_ssl_error_message_from_http_proxy = ._is_ssl_error_message_from_http_proxy at 0x7f0e6799f550> assert_same_host = False body = None body_pos = None chunked = False clean_exit = False conn = None destination_scheme = None err = NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution') headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} http_tunnel_required = False is_new_proxy_conn = False method = 'GET' parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) pool_timeout = None redirect = False release_conn = False release_this_conn = True response_kw = {'decode_content': False, 'preload_content': False} retries = Retry(total=2, connect=2, read=3, redirect=None, status=None) self = timeout = Timeout(connect=100, read=100, total=None) timeout_obj = Timeout(connect=100, read=100, total=None) url = '/' /usr/lib/python3.8/site-packages/urllib3/connectionpool.py:815: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = method = 'GET', url = '/', body = None headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=1, connect=1, read=3, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=100, read=100, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False err = NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution') clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. 
[The remainder of this frame is identical to the urlopen() frame above, and so is the next recursion level, so the repeated source is elided here. Both levels fail with the same NewConnectionError and recurse at /usr/lib/python3.8/site-packages/urllib3/connectionpool.py:815, first with retries = Retry(total=1, connect=1, read=3, redirect=None, status=None) and then with retries = Retry(total=0, connect=0, read=3, redirect=None, status=None). The fourth and final urlopen() frame receives retries = Retry(total=0, connect=0, read=3, redirect=None, status=None) with err = None.]
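The retry arithmetic that drives these recursive calls can be reproduced in isolation. A hypothetical sketch (urllib3 1.26.x assumed; not part of this test run):

# Illustration of the Retry bookkeeping visible in these frames
# (urllib3 1.26.x assumed; not code from msrest or its tests).
from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

# NewConnectionError subclasses ConnectTimeoutError, so increment()
# counts it against both the total and the connect budget.
err = NewConnectionError(None, "simulated name-resolution failure")

retries = Retry(total=2, connect=2, read=3)
retries = retries.increment("GET", "/", error=err)  # total=1, connect=1
retries = retries.increment("GET", "/", error=err)  # total=0, connect=0
try:
    retries.increment("GET", "/", error=err)        # budget now below zero
except MaxRetryError as exc:
    print(exc)  # Max retries exceeded with url: / (Caused by ...)

The except-block of that final frame, where increment() exhausts the budget and raises MaxRetryError, continues below: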
clean_exit = False def _is_ssl_error_message_from_http_proxy(ssl_error): # We're trying to detect the message 'WRONG_VERSION_NUMBER' but # SSLErrors are kinda all over the place when it comes to the message, # so we try to cover our bases here! message = " ".join(re.split("[^a-z]", str(ssl_error).lower())) return ( "wrong version number" in message or "unknown protocol" in message ) # Try to detect a common user error with proxies which is to # set an HTTP proxy to be HTTPS when it should be 'http://' # (ie {'http': 'http://proxy', 'https': 'https://proxy'}) # Instead we add a nice error message and point to a URL. if ( isinstance(e, BaseSSLError) and self.proxy and _is_ssl_error_message_from_http_proxy(e) and conn.proxy and conn.proxy.scheme == "https" ): e = ProxyError( "Your proxy appears to only use HTTP and not HTTPS, " "try changing your proxy URL to be HTTP. See: " "https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html" "#https-proxy-error-http-proxy", SSLError(e), ) elif isinstance(e, (BaseSSLError, CertificateError)): e = SSLError(e) elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy: e = ProxyError("Cannot connect to proxy.", e) elif isinstance(e, (SocketError, HTTPException)): e = ProtocolError("Connection aborted.", e) > retries = retries.increment( method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2] ) _is_ssl_error_message_from_http_proxy = ._is_ssl_error_message_from_http_proxy at 0x7f0e6989a0d0> assert_same_host = False body = None body_pos = None chunked = False clean_exit = False conn = None destination_scheme = None err = None headers = {'User-Agent': 'myusergant', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} http_tunnel_required = False is_new_proxy_conn = False method = 'GET' parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) pool_timeout = None redirect = False release_conn = False release_this_conn = True response_kw = {'decode_content': False, 'preload_content': False} retries = Retry(total=0, connect=0, read=3, redirect=None, status=None) self = timeout = Timeout(connect=100, read=100, total=None) timeout_obj = Timeout(connect=100, read=100, total=None) url = '/' /usr/lib/python3.8/site-packages/urllib3/connectionpool.py:787: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=0, read=3, redirect=None, status=None) method = 'GET', url = '/', response = None error = NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution') _pool = _stacktrace = def increment( self, method=None, url=None, response=None, error=None, _pool=None, _stacktrace=None, ): """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.HTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise six.reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? 
if connect is False: raise six.reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or not self._is_method_retryable(method): raise six.reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" redirect_location = response.get_redirect_location() status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): > raise MaxRetryError(_pool, url, error or ResponseError(cause)) E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')) _pool = _stacktrace = cause = 'unknown' connect = -1 error = NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution') history = (RequestHistory(method='GET', url='/', error=NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'), status=None, redirect_location=None), RequestHistory(method='GET', url='/', error=NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'), status=None, redirect_location=None), RequestHistory(method='GET', url='/', error=NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'), status=None, redirect_location=None), RequestHistory(method='GET', url='/', error=NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'), status=None, redirect_location=None)) method = 'GET' new_retry = Retry(total=-1, connect=-1, read=3, redirect=None, status=None) other = None read = 3 redirect = None redirect_location = None response = None self = Retry(total=0, connect=0, read=3, redirect=None, status=None) status = None status_count = None total = -1 url = '/' /usr/lib/python3.8/site-packages/urllib3/util/retry.py:592: MaxRetryError During handling of the above exception, another exception occurred: self = request = kwargs = {'allow_redirects': True, 'cert': None, 'headers': {'User-Agent': 'myusergant'}, 'stream': True, ...} session = trio_limiter = None future = msg = 'Error occurred in request.' async def send(self, request: ClientRequest, **kwargs: Any) -> AsyncClientResponse: # type: ignore """Send the request using this HTTP sender. 
""" # It's not recommended to provide its own session, and is mostly # to enable some legacy code to plug correctly session = kwargs.pop('session', self.session) trio_limiter = kwargs.get("trio_limiter", None) future = trio.to_thread.run_sync( functools.partial( session.request, request.method, request.url, **kwargs ), limiter=trio_limiter ) try: return TrioAsyncRequestsClientResponse( request, > await future ) future = kwargs = {'allow_redirects': True, 'cert': None, 'headers': {'User-Agent': 'myusergant'}, 'stream': True, 'timeout': 100, 'verify': True} msg = 'Error occurred in request.' request = self = session = trio_limiter = None msrest/universal_http/async_requests.py:224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ sync_fn = functools.partial(>, 'GET', 'http...ng.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={'User-Agent': 'myusergant'}, stream=True) cancellable = False limiter = args = (), name = 'trio.to_thread.run_sync-0' @enable_ki_protection async def to_thread_run_sync(sync_fn, *args, cancellable=False, limiter=None): """Convert a blocking operation into an async operation using a thread. These two lines are equivalent:: sync_fn(*args) await trio.to_thread.run_sync(sync_fn, *args) except that if ``sync_fn`` takes a long time, then the first line will block the Trio loop while it runs, while the second line allows other Trio tasks to continue working while ``sync_fn`` runs. This is accomplished by pushing the call to ``sync_fn(*args)`` off into a worker thread. From inside the worker thread, you can get back into Trio using the functions in `trio.from_thread`. Args: sync_fn: An arbitrary synchronous callable. *args: Positional arguments to pass to sync_fn. If you need keyword arguments, use :func:`functools.partial`. cancellable (bool): Whether to allow cancellation of this operation. See discussion below. limiter (None, or CapacityLimiter-like object): An object used to limit the number of simultaneous threads. Most commonly this will be a `~trio.CapacityLimiter`, but it could be anything providing compatible :meth:`~trio.CapacityLimiter.acquire_on_behalf_of` and :meth:`~trio.CapacityLimiter.release_on_behalf_of` methods. This function will call ``acquire_on_behalf_of`` before starting the thread, and ``release_on_behalf_of`` after the thread has finished. If None (the default), uses the default `~trio.CapacityLimiter`, as returned by :func:`current_default_thread_limiter`. **Cancellation handling**: Cancellation is a tricky issue here, because neither Python nor the operating systems it runs on provide any general mechanism for cancelling an arbitrary synchronous function running in a thread. This function will always check for cancellation on entry, before starting the thread. But once the thread is running, there are two ways it can handle being cancelled: * If ``cancellable=False``, the function ignores the cancellation and keeps going, just like if we had called ``sync_fn`` synchronously. This is the default behavior. * If ``cancellable=True``, then this function immediately raises `~trio.Cancelled`. In this case **the thread keeps running in background** – we just abandon it to do whatever it's going to do, and silently discard any return value or errors that it raises. Only use this if you know that the operation is safe and side-effect free. 
(For example: :func:`trio.socket.getaddrinfo` uses a thread with ``cancellable=True``, because it doesn't really affect anything if a stray hostname lookup keeps running in the background.) The ``limiter`` is only released after the thread has *actually* finished – which in the case of cancellation may be some time after this function has returned. If :func:`trio.run` finishes before the thread does, then the limiter release method will never be called at all. .. warning:: You should not use this function to call long-running CPU-bound functions! In addition to the usual GIL-related reasons why using threads for CPU-bound work is not very effective in Python, there is an additional problem: on CPython, `CPU-bound threads tend to "starve out" IO-bound threads `__, so using threads for CPU-bound work is likely to adversely affect the main thread running Trio. If you need to do this, you're better off using a worker process, or perhaps PyPy (which still has a GIL, but may do a better job of fairly allocating CPU time between threads). Returns: Whatever ``sync_fn(*args)`` returns. Raises: Exception: Whatever ``sync_fn(*args)`` raises. """ await trio.lowlevel.checkpoint_if_cancelled() cancellable = bool(cancellable) # raise early if cancellable.__bool__ raises if limiter is None: limiter = current_default_thread_limiter() # Holds a reference to the task that's blocked in this function waiting # for the result – or None if this function was cancelled and we should # discard the result. task_register = [trio.lowlevel.current_task()] name = f"trio.to_thread.run_sync-{next(_thread_counter)}" placeholder = ThreadPlaceholder(name) # This function gets scheduled into the Trio run loop to deliver the # thread's result. def report_back_in_trio_thread_fn(result): def do_release_then_return_result(): # release_on_behalf_of is an arbitrary user-defined method, so it # might raise an error. If it does, we want that error to # replace the regular return value, and if the regular return was # already an exception then we want them to chain. try: return result.unwrap() finally: limiter.release_on_behalf_of(placeholder) result = outcome.capture(do_release_then_return_result) if task_register[0] is not None: trio.lowlevel.reschedule(task_register[0], result) current_trio_token = trio.lowlevel.current_trio_token() def worker_fn(): current_async_library_cvar.set(None) TOKEN_LOCAL.token = current_trio_token try: ret = sync_fn(*args) if inspect.iscoroutine(ret): # Manually close coroutine to avoid RuntimeWarnings ret.close() raise TypeError( "Trio expected a sync function, but {!r} appears to be " "asynchronous".format(getattr(sync_fn, "__qualname__", sync_fn)) ) return ret finally: del TOKEN_LOCAL.token context = contextvars.copy_context() contextvars_aware_worker_fn = functools.partial(context.run, worker_fn) def deliver_worker_fn_result(result): try: current_trio_token.run_sync_soon(report_back_in_trio_thread_fn, result) except trio.RunFinishedError: # The entire run finished, so the task we're trying to contact is # certainly long gone -- it must have been cancelled and abandoned # us. 
pass await limiter.acquire_on_behalf_of(placeholder) try: start_thread_soon(contextvars_aware_worker_fn, deliver_worker_fn_result) except: limiter.release_on_behalf_of(placeholder) raise def abort(_): if cancellable: task_register[0] = None return trio.lowlevel.Abort.SUCCEEDED else: return trio.lowlevel.Abort.FAILED > return await trio.lowlevel.wait_task_rescheduled(abort) abort = .abort at 0x7f0e6799f4c0> args = () cancellable = False context = contextvars_aware_worker_fn = functools.partial(, .worker_fn at 0x7f0e6799f0d0>) current_trio_token = TrioToken(_reentry_queue=EntryQueue(queue=deque([]), idempotent_queue={}, wakeup=, done=True, lock=)) deliver_worker_fn_result = .deliver_worker_fn_result at 0x7f0e6799f1f0> limiter = name = 'trio.to_thread.run_sync-0' placeholder = ThreadPlaceholder(name='trio.to_thread.run_sync-0') report_back_in_trio_thread_fn = .report_back_in_trio_thread_fn at 0x7f0e6799f280> sync_fn = functools.partial(>, 'GET', 'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={'User-Agent': 'myusergant'}, stream=True) task_register = [.do' at 0x7f0e679f8220>] worker_fn = .worker_fn at 0x7f0e6799f0d0> /usr/lib/python3.8/site-packages/trio/_threads.py:215: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ abort_func = .abort at 0x7f0e6799f4c0> async def wait_task_rescheduled(abort_func): """Put the current task to sleep, with cancellation support. This is the lowest-level API for blocking in Trio. Every time a :class:`~trio.lowlevel.Task` blocks, it does so by calling this function (usually indirectly via some higher-level API). This is a tricky interface with no guard rails. If you can use :class:`ParkingLot` or the built-in I/O wait functions instead, then you should. Generally the way it works is that before calling this function, you make arrangements for "someone" to call :func:`reschedule` on the current task at some later point. Then you call :func:`wait_task_rescheduled`, passing in ``abort_func``, an "abort callback". (Terminology: in Trio, "aborting" is the process of attempting to interrupt a blocked task to deliver a cancellation.) There are two possibilities for what happens next: 1. "Someone" calls :func:`reschedule` on the current task, and :func:`wait_task_rescheduled` returns or raises whatever value or error was passed to :func:`reschedule`. 2. The call's context transitions to a cancelled state (e.g. due to a timeout expiring). When this happens, the ``abort_func`` is called. Its interface looks like:: def abort_func(raise_cancel): ... return trio.lowlevel.Abort.SUCCEEDED # or FAILED It should attempt to clean up any state associated with this call, and in particular, arrange that :func:`reschedule` will *not* be called later. If (and only if!) it is successful, then it should return :data:`Abort.SUCCEEDED`, in which case the task will automatically be rescheduled with an appropriate :exc:`~trio.Cancelled` error. Otherwise, it should return :data:`Abort.FAILED`. This means that the task can't be cancelled at this time, and still has to make sure that "someone" eventually calls :func:`reschedule`. At that point there are again two possibilities. You can simply ignore the cancellation altogether: wait for the operation to complete and then reschedule and continue as normal. (For example, this is what :func:`trio.to_thread.run_sync` does if cancellation is disabled.) 
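[Editor's note] The thread-hopping documented above is the pattern msrest's trio sender uses in this traceback: a blocking `session.request(...)` is bound with `functools.partial` and pushed onto a worker thread via `trio.to_thread.run_sync`. A minimal, self-contained sketch of that pattern follows; it is not taken from msrest, and `blocking_fetch` is an illustrative stand-in for the `session.request` partial seen in the `sync_fn` local:

# Minimal sketch (assumption: trio is installed) of the pattern above.
import functools
import urllib.request

import trio


def blocking_fetch(url, timeout):
    # Synchronous, blocking I/O -- must not run on the Trio event loop.
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()


async def main():
    # run_sync() pushes the blocking call into a worker thread so other
    # Trio tasks keep running; keyword/extra arguments are bound with
    # functools.partial, exactly as in the traceback's sync_fn local.
    body = await trio.to_thread.run_sync(
        functools.partial(blocking_fetch, "http://bing.com/", 100)
    )
    print(len(body))


trio.run(main)

Under FEATURES=network-sandbox this sketch fails the same way as the test, since the worker thread still performs the blocking DNS lookup.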
The other possibility is that the ``abort_func`` does succeed in cancelling the operation, but for some reason isn't able to report that right away. (Example: on Windows, it's possible to request that an async ("overlapped") I/O operation be cancelled, but this request is *also* asynchronous – you don't find out until later whether the operation was actually cancelled or not.) To report a delayed cancellation, then you should reschedule the task yourself, and call the ``raise_cancel`` callback passed to ``abort_func`` to raise a :exc:`~trio.Cancelled` (or possibly :exc:`KeyboardInterrupt`) exception into this task. Either of the approaches sketched below can work:: # Option 1: # Catch the exception from raise_cancel and inject it into the task. # (This is what Trio does automatically for you if you return # Abort.SUCCEEDED.) trio.lowlevel.reschedule(task, outcome.capture(raise_cancel)) # Option 2: # wait to be woken by "someone", and then decide whether to raise # the error from inside the task. outer_raise_cancel = None def abort(inner_raise_cancel): nonlocal outer_raise_cancel outer_raise_cancel = inner_raise_cancel TRY_TO_CANCEL_OPERATION() return trio.lowlevel.Abort.FAILED await wait_task_rescheduled(abort) if OPERATION_WAS_SUCCESSFULLY_CANCELLED: # raises the error outer_raise_cancel() In any case it's guaranteed that we only call the ``abort_func`` at most once per call to :func:`wait_task_rescheduled`. Sometimes, it's useful to be able to share some mutable sleep-related data between the sleeping task, the abort function, and the waking task. You can use the sleeping task's :data:`~Task.custom_sleep_data` attribute to store this data, and Trio won't touch it, except to make sure that it gets cleared when the task is rescheduled. .. warning:: If your ``abort_func`` raises an error, or returns any value other than :data:`Abort.SUCCEEDED` or :data:`Abort.FAILED`, then Trio will crash violently. Be careful! Similarly, it is entirely possible to deadlock a Trio program by failing to reschedule a blocked task, or cause havoc by calling :func:`reschedule` too many times. Remember what we said up above about how you should use a higher-level API if at all possible? """ > return (await _async_yield(WaitTaskRescheduled(abort_func))).unwrap() abort_func = .abort at 0x7f0e6799f4c0> /usr/lib/python3.8/site-packages/trio/_core/_traps.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def unwrap(self): self._set_unwrapped() # Tracebacks show the 'raise' line below out of context, so let's give # this variable a name that makes sense out of context. captured_error = self.error try: > raise captured_error /usr/lib/python3.8/site-packages/outcome/_impl.py:138: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_release_then_return_result(): # release_on_behalf_of is an arbitrary user-defined method, so it # might raise an error. If it does, we want that error to # replace the regular return value, and if the regular return was # already an exception then we want them to chain. 
try: > return result.unwrap() limiter = placeholder = ThreadPlaceholder(name='trio.to_thread.run_sync-0') result = Error(ConnectionError(MaxRetryError("HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))"))) /usr/lib/python3.8/site-packages/trio/_threads.py:161: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def unwrap(self): self._set_unwrapped() # Tracebacks show the 'raise' line below out of context, so let's give # this variable a name that makes sense out of context. captured_error = self.error try: > raise captured_error /usr/lib/python3.8/site-packages/outcome/_impl.py:138: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def worker_fn(): current_async_library_cvar.set(None) TOKEN_LOCAL.token = current_trio_token try: > ret = sync_fn(*args) args = () current_trio_token = TrioToken(_reentry_queue=EntryQueue(queue=deque([]), idempotent_queue={}, wakeup=, done=True, lock=)) sync_fn = functools.partial(>, 'GET', 'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={'User-Agent': 'myusergant'}, stream=True) /usr/lib/python3.8/site-packages/trio/_threads.py:175: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , method = 'GET' url = 'http://bing.com/', params = None, data = None headers = {'User-Agent': 'myusergant'}, cookies = None, files = None auth = None, timeout = 100, allow_redirects = True, proxies = {}, hooks = None stream = True, verify = True, cert = None, json = None def request( self, method, url, params=None, data=None, headers=None, cookies=None, files=None, auth=None, timeout=None, allow_redirects=True, proxies=None, hooks=None, stream=None, verify=None, cert=None, json=None, ): """Constructs a :class:`Request `, prepares it and sends it. Returns :class:`Response ` object. :param method: method for the new :class:`Request` object. :param url: URL for the new :class:`Request` object. :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`. :param data: (optional) Dictionary, list of tuples, bytes, or file-like object to send in the body of the :class:`Request`. :param json: (optional) json to send in the body of the :class:`Request`. :param headers: (optional) Dictionary of HTTP Headers to send with the :class:`Request`. :param cookies: (optional) Dict or CookieJar object to send with the :class:`Request`. :param files: (optional) Dictionary of ``'filename': file-like-objects`` for multipart encoding upload. :param auth: (optional) Auth tuple or callable to enable Basic/Digest/Custom HTTP Auth. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple :param allow_redirects: (optional) Set to True by default. :type allow_redirects: bool :param proxies: (optional) Dictionary mapping protocol or protocol and hostname to the URL of the proxy. :param stream: (optional) whether to immediately download the response content. Defaults to ``False``. :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use. Defaults to ``True``. 
When set to ``False``, requests will accept any TLS certificate presented by the server, and will ignore hostname mismatches and/or expired certificates, which will make your application vulnerable to man-in-the-middle (MitM) attacks. Setting verify to ``False`` may be useful during local development or testing. :param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair. :rtype: requests.Response """ # Create the Request. req = Request( method=method.upper(), url=url, headers=headers, files=files, data=data or {}, json=json, params=params or {}, auth=auth, cookies=cookies, hooks=hooks, ) prep = self.prepare_request(req) proxies = proxies or {} settings = self.merge_environment_settings( prep.url, proxies, stream, verify, cert ) # Send the request. send_kwargs = { "timeout": timeout, "allow_redirects": allow_redirects, } send_kwargs.update(settings) > resp = self.send(prep, **send_kwargs) allow_redirects = True auth = None cert = None cookies = None data = None files = None headers = {'User-Agent': 'myusergant'} hooks = None json = None method = 'GET' params = None prep = proxies = {} req = self = send_kwargs = {'allow_redirects': True, 'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, 'verify': True} settings = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'verify': True} stream = True timeout = 100 url = 'http://bing.com/' verify = True /usr/lib/python3.8/site-packages/requests/sessions.py:587: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, ...} allow_redirects = True, stream = True, hooks = {'response': []} adapter = start = 1667615901.8270504 def send(self, request, **kwargs): """Send a given PreparedRequest. :rtype: requests.Response """ # Set defaults that the hooks can utilize to ensure they always have # the correct parameters to reproduce the previous request. kwargs.setdefault("stream", self.stream) kwargs.setdefault("verify", self.verify) kwargs.setdefault("cert", self.cert) if "proxies" not in kwargs: kwargs["proxies"] = resolve_proxies(request, self.proxies, self.trust_env) # It's possible that users might accidentally send a Request object. # Guard against that specific failure case. if isinstance(request, Request): raise ValueError("You can only send PreparedRequests.") # Set up variables needed for resolve_redirects and dispatching of hooks allow_redirects = kwargs.pop("allow_redirects", True) stream = kwargs.get("stream") hooks = request.hooks # Get the appropriate adapter to use adapter = self.get_adapter(url=request.url) # Start time (approximately) of the request start = preferred_clock() # Send the request > r = adapter.send(request, **kwargs) adapter = allow_redirects = True hooks = {'response': []} kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, 'verify': True} request = self = start = 1667615901.8270504 stream = True /usr/lib/python3.8/site-packages/requests/sessions.py:701: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = True timeout = Timeout(connect=100, read=100, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. 
:param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) # Send the request. else: if hasattr(conn, "proxy_pool"): conn = conn.proxy_pool low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT) try: skip_host = "Host" in request.headers low_conn.putrequest( request.method, url, skip_accept_encoding=True, skip_host=skip_host, ) for header, value in request.headers.items(): low_conn.putheader(header, value) low_conn.endheaders() for i in request.body: low_conn.send(hex(len(i))[2:].encode("utf-8")) low_conn.send(b"\r\n") low_conn.send(i) low_conn.send(b"\r\n") low_conn.send(b"0\r\n\r\n") # Receive the response from the server r = low_conn.getresponse() resp = HTTPResponse.from_httplib( r, pool=conn, connection=low_conn, preload_content=False, decode_content=False, ) except Exception: # If we hit any problems here, clean up the connection. # Then, raise so that we can handle the actual exception. low_conn.close() raise except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. 
                raise SSLError(e, request=request)

>           raise ConnectionError(e, request=request)
E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

cert = None
chunked = False
conn = 
proxies = OrderedDict()
request = 
self = 
stream = True
timeout = Timeout(connect=100, read=100, total=None)
url = '/'
verify = True

/usr/lib/python3.8/site-packages/requests/adapters.py:565: ConnectionError

During handling of the above exception, another exception occurred:

    def test_conf_async_trio_requests():

        async def do():
            conf = Configuration("http://bing.com/")
            request = ClientRequest("GET", "http://bing.com/")
            policies = [
                UserAgentPolicy("myusergant")
            ]
            async with AsyncPipeline(policies, AsyncPipelineRequestsHTTPSender(AsyncTrioRequestsHTTPSender(conf))) as pipeline:
                return await pipeline.run(request)

>       response = trio.run(do)

do = .do at 0x7f0e6843e9d0>

tests/asynctests/test_pipeline.py:127:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/asynctests/test_pipeline.py:125: in do
    return await pipeline.run(request)
        conf = 
        pipeline = 
        policies = []
        request = 
msrest/pipeline/async_abc.py:159: in run
    return await first_node.send(pipeline_request, **kwargs)  # type: ignore
        context = 
        first_node = 
        kwargs = {}
        pipeline_request = 
        request = 
        self = 
msrest/pipeline/async_abc.py:79: in send
    response = await self.next.send(request, **kwargs)  # type: ignore
        kwargs = {}
        request = 
        self = 
msrest/pipeline/async_requests.py:85: in send
    await self.driver.send(request.http_request, **kwargs)
        kwargs = {}
        request = 
        self = 
msrest/universal_http/async_requests.py:236: in send
    return await super(AsyncTrioRequestsHTTPSender, self).send(request, **requests_kwargs)
        __class__ = 
        kwargs = {}
        request = 
        requests_kwargs = {'allow_redirects': True, 'cert': None, 'headers': {'User-Agent': 'myusergant'}, 'stream': True, 'timeout': 100, 'verify': True}
        self = 
msrest/universal_http/async_requests.py:228: in send
    raise_with_traceback(ClientRequestError, msg, err)
        future = 
        kwargs = {'allow_redirects': True, 'cert': None, 'headers': {'User-Agent': 'myusergant'}, 'stream': True, 'timeout': 100, 'verify': True}
        msg = 'Error occurred in request.'
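[Editor's note] The `Retry` locals above (`self = Retry(total=0, connect=0, ...)` becoming `new_retry = Retry(total=-1, connect=-1, ...)`) show exactly why urllib3 gives up: incrementing an already-spent policy drops its counters below zero. A minimal sketch of that exhaustion, assuming urllib3 1.26.x as on this stage (not code from the report):

# Sketch of the Retry exhaustion seen in the locals above (urllib3 1.26.x).
from urllib3.exceptions import MaxRetryError, NewConnectionError
from urllib3.util.retry import Retry

retry = Retry(total=0, connect=0, read=3)
# NewConnectionError normally carries the connection object; None is fine
# for this illustration.
err = NewConnectionError(
    None, "Failed to establish a new connection: "
          "[Errno -3] Temporary failure in name resolution")
try:
    # total=0/connect=0: one more connection error exhausts the policy.
    retry.increment(method="GET", url="/", error=err)
except MaxRetryError as exc:
    # requests later re-wraps this as requests.exceptions.ConnectionError,
    # and msrest wraps that again as ClientRequestError (see below).
    print(exc)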
        request = 
        self = 
        session = 
        trio_limiter = None
msrest/exceptions.py:51: in raise_with_traceback
    raise error.with_traceback(exc_traceback)
        args = (ConnectionError(MaxRetryError("HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))")),)
        error = ClientRequestError("Error occurred in request., ConnectionError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))")
        exc_msg = ('Error occurred in request., ConnectionError: '
                   "HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: "
                   "/ (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] '
                   "Temporary failure in name resolution'))")
        exc_traceback = 
        exc_type = 
        exc_value = ConnectionError(MaxRetryError("HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))"))
        exception = 
        kwargs = {}
        message = 'Error occurred in request.'
msrest/universal_http/async_requests.py:224: in send
    await future
        future = 
        kwargs = {'allow_redirects': True, 'cert': None, 'headers': {'User-Agent': 'myusergant'}, 'stream': True, 'timeout': 100, 'verify': True}
        msg = 'Error occurred in request.'
        request = 
        self = 
        session = 
        trio_limiter = None
/usr/lib/python3.8/site-packages/trio/_threads.py:215: in to_thread_run_sync
    return await trio.lowlevel.wait_task_rescheduled(abort)
        abort = .abort at 0x7f0e6799f4c0>
        args = ()
        cancellable = False
        context = 
        contextvars_aware_worker_fn = functools.partial(, .worker_fn at 0x7f0e6799f0d0>)
        current_trio_token = TrioToken(_reentry_queue=EntryQueue(queue=deque([]), idempotent_queue={}, wakeup=, done=True, lock=))
        deliver_worker_fn_result = .deliver_worker_fn_result at 0x7f0e6799f1f0>
        limiter = 
        name = 'trio.to_thread.run_sync-0'
        placeholder = ThreadPlaceholder(name='trio.to_thread.run_sync-0')
        report_back_in_trio_thread_fn = .report_back_in_trio_thread_fn at 0x7f0e6799f280>
        sync_fn = functools.partial(>, 'GET', 'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={'User-Agent': 'myusergant'}, stream=True)
        task_register = [.do' at 0x7f0e679f8220>]
        worker_fn = .worker_fn at 0x7f0e6799f0d0>
/usr/lib/python3.8/site-packages/trio/_core/_traps.py:166: in wait_task_rescheduled
    return (await _async_yield(WaitTaskRescheduled(abort_func))).unwrap()
        abort_func = .abort at 0x7f0e6799f4c0>
/usr/lib/python3.8/site-packages/outcome/_impl.py:138: in unwrap
    raise captured_error
/usr/lib/python3.8/site-packages/trio/_threads.py:161: in do_release_then_return_result
    return result.unwrap()
        limiter = 
        placeholder = ThreadPlaceholder(name='trio.to_thread.run_sync-0')
        result = Error(ConnectionError(MaxRetryError("HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))")))
/usr/lib/python3.8/site-packages/outcome/_impl.py:138: in unwrap
    raise captured_error
/usr/lib/python3.8/site-packages/trio/_threads.py:175: in worker_fn
    ret = sync_fn(*args)
        args = ()
        current_trio_token = TrioToken(_reentry_queue=EntryQueue(queue=deque([]), idempotent_queue={}, wakeup=, done=True, lock=))
        sync_fn = functools.partial(>, 'GET',
'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={'User-Agent': 'myusergant'}, stream=True) /usr/lib/python3.8/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) allow_redirects = True auth = None cert = None cookies = None data = None files = None headers = {'User-Agent': 'myusergant'} hooks = None json = None method = 'GET' params = None prep = proxies = {} req = self = send_kwargs = {'allow_redirects': True, 'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, 'verify': True} settings = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'verify': True} stream = True timeout = 100 url = 'http://bing.com/' verify = True /usr/lib/python3.8/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) adapter = allow_redirects = True hooks = {'response': []} kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, 'verify': True} request = self = start = 1667615901.8270504 stream = True _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = True timeout = Timeout(connect=100, read=100, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) # Send the request. 
                else:
                    if hasattr(conn, "proxy_pool"):
                        conn = conn.proxy_pool

                    low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)

                    try:
                        skip_host = "Host" in request.headers
                        low_conn.putrequest(
                            request.method,
                            url,
                            skip_accept_encoding=True,
                            skip_host=skip_host,
                        )

                        for header, value in request.headers.items():
                            low_conn.putheader(header, value)

                        low_conn.endheaders()

                        for i in request.body:
                            low_conn.send(hex(len(i))[2:].encode("utf-8"))
                            low_conn.send(b"\r\n")
                            low_conn.send(i)
                            low_conn.send(b"\r\n")
                        low_conn.send(b"0\r\n\r\n")

                        # Receive the response from the server
                        r = low_conn.getresponse()

                        resp = HTTPResponse.from_httplib(
                            r,
                            pool=conn,
                            connection=low_conn,
                            preload_content=False,
                            decode_content=False,
                        )
                    except Exception:
                        # If we hit any problems here, clean up the connection.
                        # Then, raise so that we can handle the actual exception.
                        low_conn.close()
                        raise

            except (ProtocolError, OSError) as err:
                raise ConnectionError(err, request=request)

            except MaxRetryError as e:
                if isinstance(e.reason, ConnectTimeoutError):
                    # TODO: Remove this in 3.0.0: see #2811
                    if not isinstance(e.reason, NewConnectionError):
                        raise ConnectTimeout(e, request=request)

                if isinstance(e.reason, ResponseError):
                    raise RetryError(e, request=request)

                if isinstance(e.reason, _ProxyError):
                    raise ProxyError(e, request=request)

                if isinstance(e.reason, _SSLError):
                    # This branch is for urllib3 v1.22 and later.
                    raise SSLError(e, request=request)

>           raise ConnectionError(e, request=request)
E           msrest.exceptions.ClientRequestError: Error occurred in request., ConnectionError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

cert = None
chunked = False
conn = 
proxies = OrderedDict()
request = 
self = 
stream = True
timeout = Timeout(connect=100, read=100, total=None)
url = '/'
verify = True

/usr/lib/python3.8/site-packages/requests/adapters.py:565: ClientRequestError
------------------------------ Captured log call -------------------------------
WARNING  urllib3.connectionpool:connectionpool.py:812 Retrying (Retry(total=2, connect=2, read=3, redirect=None, status=None)) after connection broken by 'NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /
WARNING  urllib3.connectionpool:connectionpool.py:812 Retrying (Retry(total=1, connect=1, read=3, redirect=None, status=None)) after connection broken by 'NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /
WARNING  urllib3.connectionpool:connectionpool.py:812 Retrying (Retry(total=0, connect=0, read=3, redirect=None, status=None)) after connection broken by 'NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /
________________________ test_conf_async_trio_requests _________________________

self = 

    def _new_conn(self):
        """Establish a socket connection and set nodelay settings on it.

        :return: New socket connection.
""" extra_kw = {} if self.source_address: extra_kw["source_address"] = self.source_address if self.socket_options: extra_kw["socket_options"] = self.socket_options try: > conn = connection.create_connection( (self._dns_host, self.port), self.timeout, **extra_kw ) extra_kw = {'socket_options': [(6, 1, 1)]} self = /usr/lib/python3.8/site-packages/urllib3/connection.py:174: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('bing.com', 80), timeout = 100, source_address = None socket_options = [(6, 1, 1)] def create_connection( address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT, source_address=None, socket_options=None, ): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: return six.raise_from( LocationParseError(u"'%s', label empty or too long" % host), None ) > for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): address = ('bing.com', 80) err = None family = host = 'bing.com' port = 80 socket_options = [(6, 1, 1)] source_address = None timeout = 100 /usr/lib/python3.8/site-packages/urllib3/util/connection.py:72: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ host = 'bing.com', port = 80, family = type = , proto = 0, flags = 0 def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0): """Resolve host and port into list of address info entries. Translate the host/port argument into a sequence of 5-tuples that contain all the necessary arguments for creating a socket connected to that service. host is a domain name, a string representation of an IPv4/v6 address or None. port is a string service name such as 'http', a numeric port number or None. By passing None as the value of host and port, you can pass NULL to the underlying C API. The family, type and proto arguments can be optionally specified in order to narrow the list of addresses returned. Passing zero as a value for each of these arguments selects the full range of results. """ # We override this function since we want to translate the numeric family # and socket type values to enum constants. 
addrlist = [] > for res in _socket.getaddrinfo(host, port, family, type, proto, flags): E socket.gaierror: [Errno -3] Temporary failure in name resolution addrlist = [] family = flags = 0 host = 'bing.com' port = 80 proto = 0 type = /usr/lib/python3.8/socket.py:918: gaierror During handling of the above exception, another exception occurred: self = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=0, connect=0, read=3, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=100, read=100, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. :param \\**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib` """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = response_kw.get("preload_content", True) # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = six.ensure_str(_encode_target(url)) else: url = six.ensure_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() headers.update(self.proxy_headers) # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout is_new_proxy_conn = self.proxy is not None and not getattr( conn, "sock", None ) if is_new_proxy_conn and http_tunnel_required: self._prepare_proxy(conn) # Make the request on the httplib connection object. 
> httplib_response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, ) _is_ssl_error_message_from_http_proxy = ._is_ssl_error_message_from_http_proxy at 0x7f0e6876e9d0> assert_same_host = False body = None body_pos = None chunked = False clean_exit = False conn = None destination_scheme = None err = None headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} http_tunnel_required = False is_new_proxy_conn = False method = 'GET' parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) pool_timeout = None redirect = False release_conn = False release_this_conn = True response_kw = {'decode_content': False, 'preload_content': False} retries = Retry(total=0, connect=0, read=3, redirect=None, status=None) self = timeout = Timeout(connect=100, read=100, total=None) timeout_obj = Timeout(connect=100, read=100, total=None) url = '/' /usr/lib/python3.8/site-packages/urllib3/connectionpool.py:703: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = conn = method = 'GET', url = '/', timeout = Timeout(connect=100, read=100, total=None) chunked = False httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}} timeout_obj = Timeout(connect=100, read=100, total=None) def _make_request( self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw ): """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param timeout: Socket timeout in seconds for the request. This can be a float or integer, which will set the same timeout value for the socket connect and the socket read, or an instance of :class:`urllib3.util.Timeout`, which gives you more fine-grained control over your timeouts. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = timeout_obj.connect_timeout # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout. self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # conn.request() calls http.client.*.request, not the method in # urllib3.request. It also calls makefile (recv) on the socket. 
try: if chunked: conn.request_chunked(method, url, **httplib_request_kw) else: > conn.request(method, url, **httplib_request_kw) chunked = False conn = httplib_request_kw = {'body': None, 'headers': {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}} method = 'GET' self = timeout = Timeout(connect=100, read=100, total=None) timeout_obj = Timeout(connect=100, read=100, total=None) url = '/' /usr/lib/python3.8/site-packages/urllib3/connectionpool.py:398: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} def request(self, method, url, body=None, headers=None): if headers is None: headers = {} else: # Avoid modifying the headers passed into .request() headers = headers.copy() if "user-agent" not in (six.ensure_str(k.lower()) for k in headers): headers["User-Agent"] = _get_default_user_agent() > super(HTTPConnection, self).request(method, url, body=body, headers=headers) __class__ = body = None headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} method = 'GET' self = url = '/' /usr/lib/python3.8/site-packages/urllib3/connection.py:239: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} def request(self, method, url, body=None, headers={}, *, encode_chunked=False): """Send a complete request to the server.""" > self._send_request(method, url, body, headers, encode_chunked) body = None encode_chunked = False headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} method = 'GET' self = url = '/' /usr/lib/python3.8/http/client.py:1256: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} encode_chunked = False def _send_request(self, method, url, body, headers, encode_chunked): # Honor explicitly requested Host: and Accept-Encoding: headers. header_names = frozenset(k.lower() for k in headers) skips = {} if 'host' in header_names: skips['skip_host'] = 1 if 'accept-encoding' in header_names: skips['skip_accept_encoding'] = 1 self.putrequest(method, url, **skips) # chunked encoding will happen if HTTP/1.1 is used and either # the caller passes encode_chunked=True or the following # conditions hold: # 1. content-length has not been explicitly set # 2. the body is a file or iterable, but not a str or bytes-like # 3. 
Transfer-Encoding has NOT been explicitly set by the caller if 'content-length' not in header_names: # only chunk body if not explicitly set for backwards # compatibility, assuming the client code is already handling the # chunking if 'transfer-encoding' not in header_names: # if content-length cannot be automatically determined, fall # back to chunked encoding encode_chunked = False content_length = self._get_content_length(body, method) if content_length is None: if body is not None: if self.debuglevel > 0: print('Unable to determine size of %r' % body) encode_chunked = True self.putheader('Transfer-Encoding', 'chunked') else: self.putheader('Content-Length', str(content_length)) else: encode_chunked = False for hdr, value in headers.items(): self.putheader(hdr, value) if isinstance(body, str): # RFC 2616 Section 3.7.1 says that text default has a # default charset of iso-8859-1. body = _encode(body, 'body') > self.endheaders(body, encode_chunked=encode_chunked) body = None content_length = None encode_chunked = False hdr = 'Connection' header_names = frozenset({'connection', 'accept', 'accept-encoding', 'user-agent'}) headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} method = 'GET' self = skips = {'skip_accept_encoding': 1} url = '/' value = 'keep-alive' /usr/lib/python3.8/http/client.py:1302: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = message_body = None def endheaders(self, message_body=None, *, encode_chunked=False): """Indicate that the last header line has been sent to the server. This method sends the request to the server. The optional message_body argument can be used to pass a message body associated with the request. """ if self.__state == _CS_REQ_STARTED: self.__state = _CS_REQ_SENT else: raise CannotSendHeader() > self._send_output(message_body, encode_chunked=encode_chunked) encode_chunked = False message_body = None self = /usr/lib/python3.8/http/client.py:1251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = message_body = None, encode_chunked = False def _send_output(self, message_body=None, encode_chunked=False): """Send the currently buffered request and clear the buffer. Appends an extra \\r\\n to the buffer. A message_body may be specified, to be appended to the request. """ self._buffer.extend((b"", b"")) msg = b"\r\n".join(self._buffer) del self._buffer[:] > self.send(msg) encode_chunked = False message_body = None msg = (b'GET / HTTP/1.1\r\nHost: bing.com\r\nUser-Agent: python-requests/2.28.1\r\n' b'Accept-Encoding: gzip, deflate, br\r\nAccept: */*\r\nConnection: keep-al' b'ive\r\n\r\n') self = /usr/lib/python3.8/http/client.py:1011: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = b'GET / HTTP/1.1\r\nHost: bing.com\r\nUser-Agent: python-requests/2.28.1\r\nAccept-Encoding: gzip, deflate, br\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n' def send(self, data): """Send `data' to the server. ``data`` can be a string object, a bytes object, an array object, a file-like object that supports a .read() method, or an iterable object. 
""" if self.sock is None: if self.auto_open: > self.connect() data = (b'GET / HTTP/1.1\r\nHost: bing.com\r\nUser-Agent: python-requests/2.28.1\r\n' b'Accept-Encoding: gzip, deflate, br\r\nAccept: */*\r\nConnection: keep-al' b'ive\r\n\r\n') self = /usr/lib/python3.8/http/client.py:951: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def connect(self): > conn = self._new_conn() self = /usr/lib/python3.8/site-packages/urllib3/connection.py:205: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self): """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ extra_kw = {} if self.source_address: extra_kw["source_address"] = self.source_address if self.socket_options: extra_kw["socket_options"] = self.socket_options try: conn = connection.create_connection( (self._dns_host, self.port), self.timeout, **extra_kw ) except SocketTimeout: raise ConnectTimeoutError( self, "Connection to %s timed out. (connect timeout=%s)" % (self.host, self.timeout), ) except SocketError as e: > raise NewConnectionError( self, "Failed to establish a new connection: %s" % e ) E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno -3] Temporary failure in name resolution extra_kw = {'socket_options': [(6, 1, 1)]} self = /usr/lib/python3.8/site-packages/urllib3/connection.py:186: NewConnectionError During handling of the above exception, another exception occurred: self = request = , stream = True timeout = Timeout(connect=100, read=100, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." 
) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) cert = None chunked = False conn = proxies = OrderedDict() request = self = stream = True timeout = Timeout(connect=100, read=100, total=None) url = '/' verify = True /usr/lib/python3.8/site-packages/requests/adapters.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = method = 'GET', url = '/', body = None headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'} retries = Retry(total=2, connect=2, read=3, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=100, read=100, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False err = NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution') clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib3.connectionpool.HTTPConnectionPool object>
method = 'GET', url = '/', body = None
headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
retries = Retry(total=2, connect=2, read=3, redirect=None, status=None)
redirect = False, assert_same_host = False
timeout = Timeout(connect=100, read=100, total=None), pool_timeout = None
release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True
http_tunnel_required = False
err = NewConnectionError('<urllib3.connection.HTTPConnection object>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')
clean_exit = False

    def urlopen(
        self, method, url, body=None, headers=None, retries=None, redirect=True,
        assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None,
        chunked=False, body_pos=None, **response_kw
    ):
        """
        Get a connection from the pool and perform an HTTP request. This is the lowest level call
        for making a request, so you'll need to specify all the raw details.

        .. note:: More commonly, it's appropriate to use a convenience method provided by
           :class:`.RequestMethods`, such as :meth:`request`.

        .. note:: `release_conn` will only behave as expected if `preload_content=False` because we
           want to make `preload_content=False` the default behaviour someday soon without breaking
           backwards compatibility.

        :param method: HTTP request method (such as GET, POST, PUT, etc.)
        :param url: The URL to perform the request on.
        :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an
            iterable of :class:`str`/:class:`bytes`, or a file-like object.
        :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match,
            etc. If None, pool headers are used. If provided, these headers completely replace any
            pool-specific headers.
        :param retries: Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you
            receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained
            control over different types of retries. Pass an integer number to retry connection
            errors that many times, but no other types of errors. Pass zero to never retry. If
            ``False``, then retries are disabled and any exception is raised immediately. Also,
            instead of raising a MaxRetryError on redirects, the redirect response will be returned.
        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
        :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307,
            308). Each redirect counts as a retry. Disabling retries will disable redirect, too.
        :param assert_same_host: If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When ``False``, you can use the pool on an
            HTTP proxy and request foreign hosts.
        :param timeout: If specified, overrides the default timeout for this one request. It may be
            a float (in seconds) or an instance of :class:`urllib3.util.Timeout`.
        :param pool_timeout: If set and the pool is set to block=True, then this method will block
            for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available
            within the time period.
        :param release_conn: If False, then the urlopen call will not release the connection back
            into the pool once a response is received (but will release if you read the entire
            contents of the response such as when `preload_content=True`). This is useful if you're
            not preloading the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection back into the pool.
            If None, it takes the value of ``response_kw.get('preload_content', True)``.
        :param chunked: If True, urllib3 will send the body using chunked transfer encoding.
            Otherwise, urllib3 will send the body using the standard content-length form. Defaults
            to False.
        :param int body_pos: Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will auto-populate the
            value when needed.
        :param \\**response_kw: Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """

        parsed_url = parse_url(url)
        destination_scheme = parsed_url.scheme

        if headers is None:
            headers = self.headers

        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)

        if release_conn is None:
            release_conn = response_kw.get("preload_content", True)

        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)

        # Ensure that the URL we're connecting to is properly encoded
        if url.startswith("/"):
            url = six.ensure_str(_encode_target(url))
        else:
            url = six.ensure_str(parsed_url.url)

        conn = None

        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/urllib3/urllib3/issues/651>
        release_this_conn = release_conn

        http_tunnel_required = connection_requires_http_tunnel(
            self.proxy, self.proxy_config, destination_scheme
        )

        # Merge the proxy headers. Only done when not using HTTP CONNECT. We
        # have to copy the headers dict so we can safely change it without those
        # changes being reflected in anyone else's copy.
        if not http_tunnel_required:
            headers = headers.copy()
            headers.update(self.proxy_headers)

        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None

        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False

        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)

        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)

            conn.timeout = timeout_obj.connect_timeout

            is_new_proxy_conn = self.proxy is not None and not getattr(
                conn, "sock", None
            )
            if is_new_proxy_conn and http_tunnel_required:
                self._prepare_proxy(conn)

            # Make the request on the httplib connection object.
            httplib_response = self._make_request(
                conn,
                method,
                url,
                timeout=timeout_obj,
                body=body,
                headers=headers,
                chunked=chunked,
            )

            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None

            # Pass method to Response for length checking
            response_kw["request_method"] = method

            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(
                httplib_response,
                pool=self,
                connection=response_conn,
                retries=retries,
                **response_kw
            )

            # Everything went great!
            clean_exit = True

        except EmptyPoolError:
            # Didn't get a connection from the pool, no need to clean up
            clean_exit = True
            release_this_conn = False
            raise

        except (
            TimeoutError,
            HTTPException,
            SocketError,
            ProtocolError,
            BaseSSLError,
            SSLError,
            CertificateError,
        ) as e:
            # Discard the connection for these exceptions. It will be
            # replaced during the next _get_conn() call.
            clean_exit = False

            def _is_ssl_error_message_from_http_proxy(ssl_error):
                # We're trying to detect the message 'WRONG_VERSION_NUMBER' but
                # SSLErrors are kinda all over the place when it comes to the message,
                # so we try to cover our bases here!
                message = " ".join(re.split("[^a-z]", str(ssl_error).lower()))
                return (
                    "wrong version number" in message or "unknown protocol" in message
                )

            # Try to detect a common user error with proxies which is to
            # set an HTTP proxy to be HTTPS when it should be 'http://'
            # (ie {'http': 'http://proxy', 'https': 'https://proxy'})
            # Instead we add a nice error message and point to a URL.
            if (
                isinstance(e, BaseSSLError)
                and self.proxy
                and _is_ssl_error_message_from_http_proxy(e)
                and conn.proxy
                and conn.proxy.scheme == "https"
            ):
                e = ProxyError(
                    "Your proxy appears to only use HTTP and not HTTPS, "
                    "try changing your proxy URL to be HTTP. See: "
                    "https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html"
                    "#https-proxy-error-http-proxy",
                    SSLError(e),
                )
            elif isinstance(e, (BaseSSLError, CertificateError)):
                e = SSLError(e)
            elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError("Cannot connect to proxy.", e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError("Connection aborted.", e)

            retries = retries.increment(
                method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
            )
            retries.sleep()

            # Keep track of the error for the retry warning.
            err = e

        finally:
            if not clean_exit:
                # We hit some kind of exception, handled or otherwise. We need
                # to throw the connection away unless explicitly told not to.
                # Close the connection, set the variable to None, and make sure
                # we put the None back in the pool to avoid leaking it.
                conn = conn and conn.close()
                release_this_conn = True

            if release_this_conn:
                # Put the connection back to be reused. If the connection is
                # expired then it will be None, which will get replaced with a
                # fresh connection during _get_conn.
                self._put_conn(conn)

        if not conn:
            # Try again
            log.warning(
                "Retrying (%r) after connection broken by '%r': %s", retries, err, url
            )
>           return self.urlopen(
                method, url, body, headers, retries, redirect, assert_same_host,
                timeout=timeout, pool_timeout=pool_timeout, release_conn=release_conn,
                chunked=chunked, body_pos=body_pos, **response_kw
            )

_is_ssl_error_message_from_http_proxy = <function HTTPConnectionPool.urlopen.<locals>._is_ssl_error_message_from_http_proxy at 0x7f0e679cbaf0>
assert_same_host = False, body = None, body_pos = None, chunked = False
clean_exit = False, conn = None, destination_scheme = None
err = NewConnectionError('<urllib3.connection.HTTPConnection object>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')
headers = {'User-Agent': 'python-requests/2.28.1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept': '*/*', 'Connection': 'keep-alive'}
http_tunnel_required = False, is_new_proxy_conn = False, method = 'GET'
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None)
pool_timeout = None, redirect = False, release_conn = False, release_this_conn = True
response_kw = {'decode_content': False, 'preload_content': False}
retries = Retry(total=2, connect=2, read=3, redirect=None, status=None)
self = <urllib3.connectionpool.HTTPConnectionPool object>
timeout = Timeout(connect=100, read=100, total=None)
timeout_obj = Timeout(connect=100, read=100, total=None)
url = '/'

/usr/lib/python3.8/site-packages/urllib3/connectionpool.py:815:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[snip: the HTTPConnectionPool.urlopen frame above recurses into itself twice more at
connectionpool.py:815, displayed with Retry(total=1, connect=1, read=3, redirect=None, status=None)
and then Retry(total=0, connect=0, read=3, redirect=None, status=None); the source listings are
identical to the one above and are omitted. On the fourth invocation the exception handler's
retries.increment() call raises instead of recursing:]

>           retries = retries.increment(
                method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
            )

err = None
retries = Retry(total=0, connect=0, read=3, redirect=None, status=None)
self = <urllib3.connectionpool.HTTPConnectionPool object>
url = '/'

/usr/lib/python3.8/site-packages/urllib3/connectionpool.py:787:
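The Retry object in these locals comes in through requests: the adapter passes adapter.max_retries as the retries= argument to urlopen (visible in the adapter frame near the top). A sketch of how a policy like this one would be configured; whether msrest itself mounts exactly these numbers is an assumption based on the Retry(total=2, connect=2, read=3) seen in this log:

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    # Mounting an adapter with a Retry policy; urlopen then receives it
    # as retries= and decrements it on every connection error.
    session = requests.Session()
    session.mount("http://", HTTPAdapter(max_retries=Retry(total=2, connect=2, read=3)))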
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = Retry(total=0, connect=0, read=3, redirect=None, status=None)
method = 'GET', url = '/', response = None
error = NewConnectionError('<urllib3.connection.HTTPConnection object>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')
_pool = HTTPConnectionPool(host='bing.com', port=80)
_stacktrace = <traceback object>

    def increment(
        self, method=None, url=None, response=None, error=None, _pool=None, _stacktrace=None,
    ):
        """Return a new Retry object with incremented retry counters.

        :param response: A response object, or None, if the server did not return a response.
        :type response: :class:`~urllib3.response.HTTPResponse`
        :param Exception error: An error encountered during the request, or None if the response
            was received successfully.
        :return: A new ``Retry`` object.
        """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)

        total = self.total
        if total is not None:
            total -= 1

        connect = self.connect
        read = self.read
        redirect = self.redirect
        status_count = self.status
        other = self.other
        cause = "unknown"
        status = None
        redirect_location = None

        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1

        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
                raise six.reraise(type(error), error, _stacktrace)
            elif read is not None:
                read -= 1

        elif error:
            # Other retry?
            if other is not None:
                other -= 1

        elif response and response.get_redirect_location():
            # Redirect retry?
            if redirect is not None:
                redirect -= 1
            cause = "too many redirects"
            redirect_location = response.get_redirect_location()
            status = response.status

        else:
            # Incrementing because of a server error like a 500 in
            # status_forcelist and the given method is in the allowed_methods
            cause = ResponseError.GENERIC_ERROR
            if response and response.status:
                if status_count is not None:
                    status_count -= 1
                cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                status = response.status

        history = self.history + (
            RequestHistory(method, url, error, status, redirect_location),
        )

        new_retry = self.new(
            total=total,
            connect=connect,
            read=read,
            redirect=redirect,
            status=status_count,
            other=other,
            history=history,
        )

        if new_retry.is_exhausted():
>           raise MaxRetryError(_pool, url, error or ResponseError(cause))
E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

cause = 'unknown'
connect = -1
error = NewConnectionError('<urllib3.connection.HTTPConnection object>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')
history = (RequestHistory(method='GET', url='/', error=NewConnectionError(...), status=None, redirect_location=None),) [four identical entries, one per attempt]
method = 'GET'
new_retry = Retry(total=-1, connect=-1, read=3, redirect=None, status=None)
other = None
read = 3
redirect = None, redirect_location = None
response = None
self = Retry(total=0, connect=0, read=3, redirect=None, status=None)
status = None, status_count = None
total = -1
url = '/'

/usr/lib/python3.8/site-packages/urllib3/util/retry.py:592: MaxRetryError
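The arithmetic in this frame is what finally turns the repeated DNS failures into MaxRetryError: each increment() returns a new Retry with the relevant counters decremented, and once any counter goes negative is_exhausted() is true and the error is raised. A minimal sketch of that exhaustion, assuming urllib3 1.26.x as in this log (error=None is used for brevity, so only total decrements):

    from urllib3.util.retry import Retry
    from urllib3.exceptions import MaxRetryError

    r = Retry(total=2, connect=2, read=3)
    r = r.increment(method="GET", url="/")   # -> Retry(total=1, ...)
    r = r.increment(method="GET", url="/")   # -> Retry(total=0, ...)
    try:
        r.increment(method="GET", url="/")   # total would become -1 -> exhausted
    except MaxRetryError as e:
        print(e.reason)                      # e.g. ResponseError('too many error responses')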
""" # It's not recommended to provide its own session, and is mostly # to enable some legacy code to plug correctly session = kwargs.pop('session', self.session) trio_limiter = kwargs.get("trio_limiter", None) future = trio.to_thread.run_sync( functools.partial( session.request, request.method, request.url, **kwargs ), limiter=trio_limiter ) try: return TrioAsyncRequestsClientResponse( request, > await future ) future = kwargs = {'allow_redirects': True, 'cert': None, 'headers': {}, 'stream': True, 'timeout': 100, 'verify': True} msg = 'Error occurred in request.' request = self = session = trio_limiter = None msrest/universal_http/async_requests.py:224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ sync_fn = functools.partial(>, 'GET', 'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={}, stream=True) cancellable = False limiter = args = (), name = 'trio.to_thread.run_sync-1' @enable_ki_protection async def to_thread_run_sync(sync_fn, *args, cancellable=False, limiter=None): """Convert a blocking operation into an async operation using a thread. These two lines are equivalent:: sync_fn(*args) await trio.to_thread.run_sync(sync_fn, *args) except that if ``sync_fn`` takes a long time, then the first line will block the Trio loop while it runs, while the second line allows other Trio tasks to continue working while ``sync_fn`` runs. This is accomplished by pushing the call to ``sync_fn(*args)`` off into a worker thread. From inside the worker thread, you can get back into Trio using the functions in `trio.from_thread`. Args: sync_fn: An arbitrary synchronous callable. *args: Positional arguments to pass to sync_fn. If you need keyword arguments, use :func:`functools.partial`. cancellable (bool): Whether to allow cancellation of this operation. See discussion below. limiter (None, or CapacityLimiter-like object): An object used to limit the number of simultaneous threads. Most commonly this will be a `~trio.CapacityLimiter`, but it could be anything providing compatible :meth:`~trio.CapacityLimiter.acquire_on_behalf_of` and :meth:`~trio.CapacityLimiter.release_on_behalf_of` methods. This function will call ``acquire_on_behalf_of`` before starting the thread, and ``release_on_behalf_of`` after the thread has finished. If None (the default), uses the default `~trio.CapacityLimiter`, as returned by :func:`current_default_thread_limiter`. **Cancellation handling**: Cancellation is a tricky issue here, because neither Python nor the operating systems it runs on provide any general mechanism for cancelling an arbitrary synchronous function running in a thread. This function will always check for cancellation on entry, before starting the thread. But once the thread is running, there are two ways it can handle being cancelled: * If ``cancellable=False``, the function ignores the cancellation and keeps going, just like if we had called ``sync_fn`` synchronously. This is the default behavior. * If ``cancellable=True``, then this function immediately raises `~trio.Cancelled`. In this case **the thread keeps running in background** – we just abandon it to do whatever it's going to do, and silently discard any return value or errors that it raises. Only use this if you know that the operation is safe and side-effect free. (For example: :func:`trio.socket.getaddrinfo` uses a thread with ``cancellable=True``, because it doesn't really affect anything if a stray hostname lookup keeps running in the background.) 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

sync_fn = functools.partial(<bound method Session.request of <requests.Session object>>, 'GET', 'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={}, stream=True)
cancellable = False
limiter = <trio.CapacityLimiter object>
args = (), name = 'trio.to_thread.run_sync-1'

    @enable_ki_protection
    async def to_thread_run_sync(sync_fn, *args, cancellable=False, limiter=None):
        """Convert a blocking operation into an async operation using a thread.

        These two lines are equivalent::

            sync_fn(*args)
            await trio.to_thread.run_sync(sync_fn, *args)

        except that if ``sync_fn`` takes a long time, then the first line will block the Trio loop
        while it runs, while the second line allows other Trio tasks to continue working while
        ``sync_fn`` runs. This is accomplished by pushing the call to ``sync_fn(*args)`` off into a
        worker thread. From inside the worker thread, you can get back into Trio using the
        functions in `trio.from_thread`.

        Args:
          sync_fn: An arbitrary synchronous callable.
          *args: Positional arguments to pass to sync_fn. If you need keyword arguments, use
              :func:`functools.partial`.
          cancellable (bool): Whether to allow cancellation of this operation. See discussion
              below.
          limiter (None, or CapacityLimiter-like object): An object used to limit the number of
              simultaneous threads. Most commonly this will be a `~trio.CapacityLimiter`, but it
              could be anything providing compatible
              :meth:`~trio.CapacityLimiter.acquire_on_behalf_of` and
              :meth:`~trio.CapacityLimiter.release_on_behalf_of` methods. This function will call
              ``acquire_on_behalf_of`` before starting the thread, and ``release_on_behalf_of``
              after the thread has finished. If None (the default), uses the default
              `~trio.CapacityLimiter`, as returned by :func:`current_default_thread_limiter`.

        **Cancellation handling**: Cancellation is a tricky issue here, because neither Python nor
        the operating systems it runs on provide any general mechanism for cancelling an arbitrary
        synchronous function running in a thread. This function will always check for cancellation
        on entry, before starting the thread. But once the thread is running, there are two ways it
        can handle being cancelled:

        * If ``cancellable=False``, the function ignores the cancellation and keeps going, just
          like if we had called ``sync_fn`` synchronously. This is the default behavior.

        * If ``cancellable=True``, then this function immediately raises `~trio.Cancelled`. In
          this case **the thread keeps running in background** – we just abandon it to do whatever
          it's going to do, and silently discard any return value or errors that it raises. Only
          use this if you know that the operation is safe and side-effect free. (For example:
          :func:`trio.socket.getaddrinfo` uses a thread with ``cancellable=True``, because it
          doesn't really affect anything if a stray hostname lookup keeps running in the
          background.)

        The ``limiter`` is only released after the thread has *actually* finished – which in the
        case of cancellation may be some time after this function has returned. If :func:`trio.run`
        finishes before the thread does, then the limiter release method will never be called at
        all.

        .. warning:: You should not use this function to call long-running CPU-bound functions! In
           addition to the usual GIL-related reasons why using threads for CPU-bound work is not
           very effective in Python, there is an additional problem: on CPython, `CPU-bound threads
           tend to "starve out" IO-bound threads`__, so using threads for CPU-bound work is likely
           to adversely affect the main thread running Trio. If you need to do this, you're better
           off using a worker process, or perhaps PyPy (which still has a GIL, but may do a better
           job of fairly allocating CPU time between threads).

        Returns:
          Whatever ``sync_fn(*args)`` returns.

        Raises:
          Exception: Whatever ``sync_fn(*args)`` raises.

        """
        await trio.lowlevel.checkpoint_if_cancelled()
        cancellable = bool(cancellable)  # raise early if cancellable.__bool__ raises
        if limiter is None:
            limiter = current_default_thread_limiter()

        # Holds a reference to the task that's blocked in this function waiting
        # for the result – or None if this function was cancelled and we should
        # discard the result.
        task_register = [trio.lowlevel.current_task()]
        name = f"trio.to_thread.run_sync-{next(_thread_counter)}"
        placeholder = ThreadPlaceholder(name)

        # This function gets scheduled into the Trio run loop to deliver the
        # thread's result.
        def report_back_in_trio_thread_fn(result):
            def do_release_then_return_result():
                # release_on_behalf_of is an arbitrary user-defined method, so it
                # might raise an error. If it does, we want that error to
                # replace the regular return value, and if the regular return was
                # already an exception then we want them to chain.
                try:
                    return result.unwrap()
                finally:
                    limiter.release_on_behalf_of(placeholder)

            result = outcome.capture(do_release_then_return_result)
            if task_register[0] is not None:
                trio.lowlevel.reschedule(task_register[0], result)

        current_trio_token = trio.lowlevel.current_trio_token()

        def worker_fn():
            current_async_library_cvar.set(None)
            TOKEN_LOCAL.token = current_trio_token
            try:
                ret = sync_fn(*args)

                if inspect.iscoroutine(ret):
                    # Manually close coroutine to avoid RuntimeWarnings
                    ret.close()
                    raise TypeError(
                        "Trio expected a sync function, but {!r} appears to be "
                        "asynchronous".format(getattr(sync_fn, "__qualname__", sync_fn))
                    )

                return ret
            finally:
                del TOKEN_LOCAL.token

        context = contextvars.copy_context()
        contextvars_aware_worker_fn = functools.partial(context.run, worker_fn)

        def deliver_worker_fn_result(result):
            try:
                current_trio_token.run_sync_soon(report_back_in_trio_thread_fn, result)
            except trio.RunFinishedError:
                # The entire run finished, so the task we're trying to contact is
                # certainly long gone -- it must have been cancelled and abandoned
                # us.
                pass

        await limiter.acquire_on_behalf_of(placeholder)
        try:
            start_thread_soon(contextvars_aware_worker_fn, deliver_worker_fn_result)
        except:
            limiter.release_on_behalf_of(placeholder)
            raise

        def abort(_):
            if cancellable:
                task_register[0] = None
                return trio.lowlevel.Abort.SUCCEEDED
            else:
                return trio.lowlevel.Abort.FAILED

>       return await trio.lowlevel.wait_task_rescheduled(abort)

abort = <function to_thread_run_sync.<locals>.abort at 0x7f0e6843eb80>
args = ()
cancellable = False
context = <Context object>
contextvars_aware_worker_fn = functools.partial(<built-in method run of Context object>, <function to_thread_run_sync.<locals>.worker_fn at 0x7f0e687101f0>)
current_trio_token = TrioToken(_reentry_queue=EntryQueue(queue=deque([]), idempotent_queue={}, wakeup=<WakeupSocketpair object>, done=True, lock=<unlocked _thread.lock object>))
deliver_worker_fn_result = <function to_thread_run_sync.<locals>.deliver_worker_fn_result at 0x7f0e68710280>
limiter = <trio.CapacityLimiter object>
name = 'trio.to_thread.run_sync-1'
placeholder = ThreadPlaceholder(name='trio.to_thread.run_sync-1')
report_back_in_trio_thread_fn = <function to_thread_run_sync.<locals>.report_back_in_trio_thread_fn at 0x7f0e68710310>
sync_fn = functools.partial(<bound method Session.request of <requests.Session object>>, 'GET', 'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={}, stream=True)
task_register = [<Task at 0x7f0e6798b040>]
worker_fn = <function to_thread_run_sync.<locals>.worker_fn at 0x7f0e687101f0>

/usr/lib/python3.8/site-packages/trio/_threads.py:215:
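The docstring above describes the limiter hand-off: the token is acquired before the worker thread starts and released only after the thread has actually finished. A small sketch of passing an explicit limiter, the same keyword the msrest sender forwards as trio_limiter (here None, so the default limiter was used):

    import trio

    async def main():
        limiter = trio.CapacityLimiter(1)  # at most one worker thread
        await trio.to_thread.run_sync(print, "running in a worker thread",
                                      limiter=limiter)

    trio.run(main)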
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

abort_func = <function to_thread_run_sync.<locals>.abort at 0x7f0e6843eb80>

    async def wait_task_rescheduled(abort_func):
        """Put the current task to sleep, with cancellation support.

        This is the lowest-level API for blocking in Trio. Every time a
        :class:`~trio.lowlevel.Task` blocks, it does so by calling this function (usually
        indirectly via some higher-level API).

        This is a tricky interface with no guard rails. If you can use :class:`ParkingLot` or the
        built-in I/O wait functions instead, then you should.

        Generally the way it works is that before calling this function, you make arrangements for
        "someone" to call :func:`reschedule` on the current task at some later point. Then you
        call :func:`wait_task_rescheduled`, passing in ``abort_func``, an "abort callback".
        (Terminology: in Trio, "aborting" is the process of attempting to interrupt a blocked task
        to deliver a cancellation.)

        There are two possibilities for what happens next:

        1. "Someone" calls :func:`reschedule` on the current task, and
           :func:`wait_task_rescheduled` returns or raises whatever value or error was passed to
           :func:`reschedule`.

        2. The call's context transitions to a cancelled state (e.g. due to a timeout expiring).
           When this happens, the ``abort_func`` is called. Its interface looks like::

               def abort_func(raise_cancel):
                   ...
                   return trio.lowlevel.Abort.SUCCEEDED  # or FAILED

           It should attempt to clean up any state associated with this call, and in particular,
           arrange that :func:`reschedule` will *not* be called later. If (and only if!) it is
           successful, then it should return :data:`Abort.SUCCEEDED`, in which case the task will
           automatically be rescheduled with an appropriate :exc:`~trio.Cancelled` error.

           Otherwise, it should return :data:`Abort.FAILED`. This means that the task can't be
           cancelled at this time, and still has to make sure that "someone" eventually calls
           :func:`reschedule`.

           At that point there are again two possibilities. You can simply ignore the cancellation
           altogether: wait for the operation to complete and then reschedule and continue as
           normal. (For example, this is what :func:`trio.to_thread.run_sync` does if cancellation
           is disabled.) The other possibility is that the ``abort_func`` does succeed in
           cancelling the operation, but for some reason isn't able to report that right away.
           (Example: on Windows, it's possible to request that an async ("overlapped") I/O
           operation be cancelled, but this request is *also* asynchronous – you don't find out
           until later whether the operation was actually cancelled or not.) To report a delayed
           cancellation, then you should reschedule the task yourself, and call the
           ``raise_cancel`` callback passed to ``abort_func`` to raise a :exc:`~trio.Cancelled`
           (or possibly :exc:`KeyboardInterrupt`) exception into this task. Either of the
           approaches sketched below can work::

               # Option 1:
               # Catch the exception from raise_cancel and inject it into the task.
               # (This is what Trio does automatically for you if you return
               # Abort.SUCCEEDED.)
               trio.lowlevel.reschedule(task, outcome.capture(raise_cancel))

               # Option 2:
               # wait to be woken by "someone", and then decide whether to raise
               # the error from inside the task.
               outer_raise_cancel = None

               def abort(inner_raise_cancel):
                   nonlocal outer_raise_cancel
                   outer_raise_cancel = inner_raise_cancel
                   TRY_TO_CANCEL_OPERATION()
                   return trio.lowlevel.Abort.FAILED

               await wait_task_rescheduled(abort)
               if OPERATION_WAS_SUCCESSFULLY_CANCELLED:
                   # raises the error
                   outer_raise_cancel()

        In any case it's guaranteed that we only call the ``abort_func`` at most once per call to
        :func:`wait_task_rescheduled`.

        Sometimes, it's useful to be able to share some mutable sleep-related data between the
        sleeping task, the abort function, and the waking task. You can use the sleeping task's
        :data:`~Task.custom_sleep_data` attribute to store this data, and Trio won't touch it,
        except to make sure that it gets cleared when the task is rescheduled.

        .. warning:: If your ``abort_func`` raises an error, or returns any value other than
           :data:`Abort.SUCCEEDED` or :data:`Abort.FAILED`, then Trio will crash violently. Be
           careful! Similarly, it is entirely possible to deadlock a Trio program by failing to
           reschedule a blocked task, or cause havoc by calling :func:`reschedule` too many times.
           Remember what we said up above about how you should use a higher-level API if at all
           possible?

        """
>       return (await _async_yield(WaitTaskRescheduled(abort_func))).unwrap()

abort_func = <function to_thread_run_sync.<locals>.abort at 0x7f0e6843eb80>

/usr/lib/python3.8/site-packages/trio/_core/_traps.py:166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def unwrap(self):
        self._set_unwrapped()
        # Tracebacks show the 'raise' line below out of context, so let's give
        # this variable a name that makes sense out of context.
        captured_error = self.error
        try:
>           raise captured_error

/usr/lib/python3.8/site-packages/outcome/_impl.py:138:
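The unwrap() frame above is how the worker thread's exception re-enters the Trio task: the thread's result is captured as an outcome object, and unwrapping it re-raises the stored error. A minimal sketch of those mechanics with a stand-in exception:

    import outcome

    # outcome.capture runs a callable and wraps its return value or its
    # exception; unwrap() returns the value or re-raises the exception.
    def boom():
        raise ConnectionError("stand-in for the MaxRetryError above")

    result = outcome.capture(boom)
    print(result)    # Error(ConnectionError('stand-in for the MaxRetryError above'))
    result.unwrap()  # re-raises the ConnectionError in the caller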
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def unwrap(self):
        self._set_unwrapped()
        # Tracebacks show the 'raise' line below out of context, so let's give
        # this variable a name that makes sense out of context.
        captured_error = self.error
        try:
>           raise captured_error

/usr/lib/python3.8/site-packages/outcome/_impl.py:138:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_release_then_return_result():
        # release_on_behalf_of is an arbitrary user-defined method, so it
        # might raise an error. If it does, we want that error to
        # replace the regular return value, and if the regular return was
        # already an exception then we want them to chain.
        try:
>           return result.unwrap()

limiter     = <...>
placeholder = ThreadPlaceholder(name='trio.to_thread.run_sync-1')
result      = Error(ConnectionError(MaxRetryError("HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))")))

/usr/lib/python3.8/site-packages/trio/_threads.py:161:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def unwrap(self):
        self._set_unwrapped()
        # Tracebacks show the 'raise' line below out of context, so let's give
        # this variable a name that makes sense out of context.
        captured_error = self.error
        try:
>           raise captured_error

/usr/lib/python3.8/site-packages/outcome/_impl.py:138:
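The `Error(...).unwrap()` frames above are the `outcome` library's mechanism for carrying the worker thread's exception back into the trio task. A minimal sketch of that capture/unwrap round-trip (assuming only the `outcome` package is installed):

    import outcome

    def fail():
        raise ValueError("boom")

    # capture() runs the function and wraps the result: Value on success,
    # Error on exception -- exactly what the worker thread's result is above.
    result = outcome.capture(fail)
    print(result)    # Error(ValueError('boom'))

    # unwrap() re-raises the captured exception in the unwrapping context,
    # which is why the traceback above passes through outcome/_impl.py:138.
    result.unwrap()  # raises ValueError: boom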
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def worker_fn():
        current_async_library_cvar.set(None)
        TOKEN_LOCAL.token = current_trio_token
        try:
>           ret = sync_fn(*args)

args = ()
current_trio_token = TrioToken(_reentry_queue=EntryQueue(queue=deque([]), idempotent_queue={}, wakeup=<...>, done=True, lock=<...>))
sync_fn = functools.partial(<bound method Session.request of <...>>, 'GET', 'http://bing.com/', timeout=100, verify=True, cert=None, allow_redirects=True, headers={}, stream=True)

/usr/lib/python3.8/site-packages/trio/_threads.py:175:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>, method = 'GET'
url = 'http://bing.com/', params = None, data = None, headers = {}
cookies = None, files = None, auth = None, timeout = 100, allow_redirects = True
proxies = {}, hooks = None, stream = True, verify = True, cert = None
json = None

    def request(
        self, method, url, params=None, data=None, headers=None, cookies=None,
        files=None, auth=None, timeout=None, allow_redirects=True,
        proxies=None, hooks=None, stream=None, verify=None, cert=None,
        json=None,
    ):
        """Constructs a :class:`Request <Request>`, prepares it and sends it.
        Returns :class:`Response <Response>` object.

        :param method: method for the new :class:`Request` object.
        :param url: URL for the new :class:`Request` object.
        :param params: (optional) Dictionary or bytes to be sent in the query
            string for the :class:`Request`.
        :param data: (optional) Dictionary, list of tuples, bytes, or
            file-like object to send in the body of the :class:`Request`.
        :param json: (optional) json to send in the body of the
            :class:`Request`.
        :param headers: (optional) Dictionary of HTTP Headers to send with
            the :class:`Request`.
        :param cookies: (optional) Dict or CookieJar object to send with the
            :class:`Request`.
        :param files: (optional) Dictionary of ``'filename':
            file-like-objects`` for multipart encoding upload.
        :param auth: (optional) Auth tuple or callable to enable
            Basic/Digest/Custom HTTP Auth.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple
        :param allow_redirects: (optional) Set to True by default.
        :type allow_redirects: bool
        :param proxies: (optional) Dictionary mapping protocol or protocol
            and hostname to the URL of the proxy.
        :param stream: (optional) whether to immediately download the
            response content. Defaults to ``False``.
        :param verify: (optional) Either a boolean, in which case it controls
            whether we verify the server's TLS certificate, or a string, in
            which case it must be a path to a CA bundle to use. Defaults to
            ``True``. When set to ``False``, requests will accept any TLS
            certificate presented by the server, and will ignore hostname
            mismatches and/or expired certificates, which will make your
            application vulnerable to man-in-the-middle (MitM) attacks.
            Setting verify to ``False`` may be useful during local
            development or testing.
        :param cert: (optional) if String, path to ssl client cert file
            (.pem). If Tuple, ('cert', 'key') pair.
        :rtype: requests.Response
        """
        # Create the Request.
        req = Request(
            method=method.upper(),
            url=url,
            headers=headers,
            files=files,
            data=data or {},
            json=json,
            params=params or {},
            auth=auth,
            cookies=cookies,
            hooks=hooks,
        )
        prep = self.prepare_request(req)

        proxies = proxies or {}

        settings = self.merge_environment_settings(
            prep.url, proxies, stream, verify, cert
        )

        # Send the request.
        send_kwargs = {
            "timeout": timeout,
            "allow_redirects": allow_redirects,
        }
        send_kwargs.update(settings)
>       resp = self.send(prep, **send_kwargs)

allow_redirects = True
auth = None
cert = None
cookies = None
data = None
files = None
headers = {}
hooks = None
json = None
method = 'GET'
params = None
prep = <...>
proxies = {}
req = <...>
self = <...>
send_kwargs = {'allow_redirects': True, 'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, 'verify': True}
settings = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'verify': True}
stream = True
timeout = 100
url = 'http://bing.com/'
verify = True

/usr/lib/python3.8/site-packages/requests/sessions.py:587:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
request = <...>
kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, ...}
allow_redirects = True, stream = True, hooks = {'response': []}
adapter = <...>
start = 1667615907.3888144

    def send(self, request, **kwargs):
        """Send a given PreparedRequest.

        :rtype: requests.Response
        """
        # Set defaults that the hooks can utilize to ensure they always have
        # the correct parameters to reproduce the previous request.
        kwargs.setdefault("stream", self.stream)
        kwargs.setdefault("verify", self.verify)
        kwargs.setdefault("cert", self.cert)
        if "proxies" not in kwargs:
            kwargs["proxies"] = resolve_proxies(request, self.proxies, self.trust_env)

        # It's possible that users might accidentally send a Request object.
        # Guard against that specific failure case.
        if isinstance(request, Request):
            raise ValueError("You can only send PreparedRequests.")

        # Set up variables needed for resolve_redirects and dispatching of hooks
        allow_redirects = kwargs.pop("allow_redirects", True)
        stream = kwargs.get("stream")
        hooks = request.hooks

        # Get the appropriate adapter to use
        adapter = self.get_adapter(url=request.url)

        # Start time (approximately) of the request
        start = preferred_clock()

        # Send the request
>       r = adapter.send(request, **kwargs)

adapter = <...>
allow_redirects = True
hooks = {'response': []}
kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': True, 'timeout': 100, 'verify': True}
request = <...>
self = <...>
start = 1667615907.3888144
stream = True

/usr/lib/python3.8/site-packages/requests/sessions.py:701:
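The `sync_fn` run in the worker thread is an ordinary `requests.Session.request` call. A sketch that reproduces it directly, using the same arguments shown in the `functools.partial(...)` locals above (nothing msrest-specific; on a host without working DNS it fails the same way):

    import requests

    with requests.Session() as session:
        # Same arguments as the partial() captured in the traceback locals.
        response = session.request(
            "GET",
            "http://bing.com/",
            timeout=100,
            verify=True,
            cert=None,
            allow_redirects=True,
            headers={},
            stream=True,
        )
        print(response.status_code)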
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
request = <...>, stream = True
timeout = Timeout(connect=100, read=100, total=None), verify = True, cert = None
proxies = OrderedDict()

    def send(
        self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
    ):
        """Sends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls
            whether we verify the server's TLS certificate, or a string, in
            which case it must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be
            trusted.
        :param proxies: (optional) The proxies dictionary to apply to the
            request.
        :rtype: requests.Response
        """

        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)

        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(
            request,
            stream=stream,
            timeout=timeout,
            verify=verify,
            cert=cert,
            proxies=proxies,
        )

        chunked = not (request.body is None or "Content-Length" in request.headers)

        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError:
                raise ValueError(
                    f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                    f"or a single float to set both timeouts to the same value."
                )
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)

        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout,
                )

            # Send the request.
            else:
                if hasattr(conn, "proxy_pool"):
                    conn = conn.proxy_pool

                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)

                try:
                    skip_host = "Host" in request.headers
                    low_conn.putrequest(
                        request.method,
                        url,
                        skip_accept_encoding=True,
                        skip_host=skip_host,
                    )

                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)

                    low_conn.endheaders()

                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode("utf-8"))
                        low_conn.send(b"\r\n")
                        low_conn.send(i)
                        low_conn.send(b"\r\n")
                    low_conn.send(b"0\r\n\r\n")

                    # Receive the response from the server
                    r = low_conn.getresponse()

                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False,
                    )
                except Exception:
                    # If we hit any problems here, clean up the connection.
                    # Then, raise so that we can handle the actual exception.
                    low_conn.close()
                    raise

        except (ProtocolError, OSError) as err:
            raise ConnectionError(err, request=request)

        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)

            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)

            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)

            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
                raise SSLError(e, request=request)

>           raise ConnectionError(e, request=request)
E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

cert = None
chunked = False
conn = <...>
proxies = OrderedDict()
request = <...>
self = <...>
stream = True
timeout = Timeout(connect=100, read=100, total=None)
url = '/'
verify = True

/usr/lib/python3.8/site-packages/requests/adapters.py:565: ConnectionError
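Note how `HTTPAdapter.send()` normalizes the timeout: the single float `100` passed by the test becomes `Timeout(connect=100, read=100, total=None)` in the locals above, while a tuple would set the two halves independently. For illustration (plain `requests` usage, nothing from the msrest suite):

    import requests

    # A single float sets both the connect and read timeouts, as in the
    # failing test ...
    requests.get("http://bing.com/", timeout=100)

    # ... while a (connect, read) tuple sets them separately.
    requests.get("http://bing.com/", timeout=(3.05, 27))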
During handling of the above exception, another exception occurred:

    def test_conf_async_trio_requests():

        async def do():
            conf = Configuration("http://bing.com/")
            request = ClientRequest("GET", "http://bing.com/")
            async with AsyncTrioRequestsHTTPSender(conf) as sender:
                return await sender.send(request)
                assert response.body() is not None

>       response = trio.run(do)

do = <function test_conf_async_trio_requests.<locals>.do at 0x7f0e68758af0>

tests/asynctests/test_universal_http.py:87:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/asynctests/test_universal_http.py:84: in do
    return await sender.send(request)
        conf = <...>
        request = <...>
        sender = <...>
msrest/universal_http/async_requests.py:236: in send
    return await super(AsyncTrioRequestsHTTPSender, self).send(request, **requests_kwargs)
        __class__ = <...>
        kwargs = {}
        request = <...>
        requests_kwargs = {'allow_redirects': True, 'cert': None, 'headers': {}, 'stream': True, 'timeout': 100, 'verify': True}
        self = <...>
msrest/universal_http/async_requests.py:228: in send
    raise_with_traceback(ClientRequestError, msg, err)
        future = <...>
        kwargs = {'allow_redirects': True, 'cert': None, 'headers': {}, 'stream': True, 'timeout': 100, 'verify': True}
        msg = 'Error occurred in request.'
        request = <...>
        self = <...>
        session = <...>
        trio_limiter = None
msrest/exceptions.py:51: in raise_with_traceback
    raise error.with_traceback(exc_traceback)
        args = (ConnectionError(MaxRetryError("HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))")),)
        error = ClientRequestError("Error occurred in request., ConnectionError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))")
        exc_msg = ('Error occurred in request., ConnectionError: '
                   "HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: "
                   "/ (Caused by NewConnectionError('<...>: Failed to establish a new connection: [Errno -3] "
                   "Temporary failure in name resolution'))")
        exc_traceback = <...>
        exc_type = <...>
        exc_value = ConnectionError(MaxRetryError("HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))"))
        exception = <...>
        kwargs = {}
        message = 'Error occurred in request.'
msrest/universal_http/async_requests.py:224: in send
    await future
        future = <...>
        kwargs = {'allow_redirects': True, 'cert': None, 'headers': {}, 'stream': True, 'timeout': 100, 'verify': True}
        msg = 'Error occurred in request.'
        request = <...>
        self = <...>
        session = <...>
        trio_limiter = None
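msrest's `raise_with_traceback` (msrest/exceptions.py:51) converts the `requests` ConnectionError into a `ClientRequestError` while keeping the original traceback, which is why the final report below points the msrest exception into requests/adapters.py. A rough sketch of the pattern, not the exact msrest source:

    import sys

    class ClientRequestError(Exception):
        """Stand-in for msrest.exceptions.ClientRequestError."""

    def raise_with_traceback(exception_type, message="", *args, **kwargs):
        # Build a new exception embedding the active exception's type and
        # message, then re-raise it with the *original* traceback attached.
        exc_type, exc_value, exc_traceback = sys.exc_info()
        exc_msg = "{}, {}: {}".format(message, exc_type.__name__, exc_value)
        error = exception_type(exc_msg, *args, **kwargs)
        raise error.with_traceback(exc_traceback)

    try:
        raise ConnectionError("name resolution failed")
    except ConnectionError as err:
        raise_with_traceback(ClientRequestError, "Error occurred in request.", err)

This matches the message format visible in the `exc_msg` local above ("Error occurred in request., ConnectionError: ...").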
/usr/lib/python3.8/site-packages/trio/_threads.py:215: in to_thread_run_sync
    return await trio.lowlevel.wait_task_rescheduled(abort)
/usr/lib/python3.8/site-packages/trio/_core/_traps.py:166: in wait_task_rescheduled
    return (await _async_yield(WaitTaskRescheduled(abort_func))).unwrap()
/usr/lib/python3.8/site-packages/outcome/_impl.py:138: in unwrap
    raise captured_error
/usr/lib/python3.8/site-packages/trio/_threads.py:161: in do_release_then_return_result
    return result.unwrap()
/usr/lib/python3.8/site-packages/outcome/_impl.py:138: in unwrap
    raise captured_error
/usr/lib/python3.8/site-packages/trio/_threads.py:175: in worker_fn
    ret = sync_fn(*args)
/usr/lib/python3.8/site-packages/requests/sessions.py:587: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.8/site-packages/requests/sessions.py:701: in send
    r = adapter.send(request, **kwargs)
    [locals for these frames are identical to the first traceback above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
request = <...>, stream = True
timeout = Timeout(connect=100, read=100, total=None), verify = True, cert = None
proxies = OrderedDict()
    def send(
        self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
    ):
        [docstring and body of HTTPAdapter.send() repeated verbatim, as listed in the first traceback above]
>           raise ConnectionError(e, request=request)
E           msrest.exceptions.ClientRequestError: Error occurred in request., ConnectionError: HTTPConnectionPool(host='bing.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

cert = None
chunked = False
conn = <...>
proxies = OrderedDict()
request = <...>
self = <...>
stream = True
timeout = Timeout(connect=100, read=100, total=None)
url = '/'
verify = True

/usr/lib/python3.8/site-packages/requests/adapters.py:565: ClientRequestError
------------------------------ Captured log call -------------------------------
WARNING  urllib3.connectionpool:connectionpool.py:812 Retrying (Retry(total=2, connect=2, read=3, redirect=None, status=None)) after connection broken by 'NewConnectionError('<...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /
WARNING  urllib3.connectionpool:connectionpool.py:812 Retrying (Retry(total=1, connect=1, read=3, redirect=None, status=None)) after connection broken by 'NewConnectionError('<...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /
WARNING  urllib3.connectionpool:connectionpool.py:812 Retrying (Retry(total=0, connect=0, read=3, redirect=None, status=None)) after connection broken by 'NewConnectionError('<...>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /
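Root cause: both failures boil down to "[Errno -3] Temporary failure in name resolution" -- the two test_conf_async_trio_requests tests try to reach http://bing.com/, which cannot work in a build environment without outbound network access. One possible guard (an illustration only, not the packager's actual fix) is a conftest.py marker that skips network-dependent tests when DNS is unavailable; alternatively the ebuild could deselect the two tests via EPYTEST_DESELECT:

    # conftest.py -- hypothetical guard, not part of the msrest test suite.
    import socket

    import pytest

    def network_available(host="bing.com"):
        """Best-effort check for working DNS / outbound network."""
        try:
            socket.gethostbyname(host)
            return True
        except OSError:
            return False

    # Tests decorated with this marker would be skipped in a sandboxed
    # build instead of failing with ClientRequestError as above.
    requires_network = pytest.mark.skipif(
        not network_available(), reason="requires outbound network access"
    )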
=============================== warnings summary ===============================
tests/asynctests/test_async_client.py:56
  /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1/tests/asynctests/test_async_client.py:56: PytestUnknownMarkWarning: Unknown pytest.mark.asyncio - is this a typo? You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.asyncio

  [the same PytestUnknownMarkWarning for @pytest.mark.asyncio is reported at:]
  tests/asynctests/test_async_client.py:162
  tests/asynctests/test_async_paging.py:43, 67, 92, 119, 158
  tests/asynctests/test_pipeline.py:52, 78, 91, 103
  tests/asynctests/test_polling.py:40, 60, 116, 149
  tests/asynctests/test_universal_http.py:46, 57, 67

tests/test_auth.py::TestAuthentication::test_apikey_auth
tests/test_auth.py::TestAuthentication::test_cs_auth
tests/test_auth.py::TestAuthentication::test_eventgrid_auth
tests/test_auth.py::TestAuthentication::test_eventgrid_domain_auth
  /usr/lib/python3.8/unittest/case.py:1215: DeprecationWarning: assertDictContainsSubset is deprecated
    warnings.warn('assertDictContainsSubset is deprecated',

tests/test_client.py::TestServiceClient::test_client_request
tests/test_client.py::TestServiceClient::test_deprecated_creds
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_3_times
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_404
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_408
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_501
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_502
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_505
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_max
  /var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1/msrest/service_client.py:259: DeprecationWarning: Creds parameter is deprecated. Set config.credentials instead.
    warnings.warn("Creds parameter is deprecated. Set config.credentials instead.",

tests/test_runtime.py: 14 warnings
  /usr/lib/python3.8/site-packages/urllib3/util/retry.py:455: DeprecationWarning: Using 'method_whitelist' with Retry is deprecated and will be removed in v2.0. Use 'allowed_methods' instead
    warnings.warn(

tests/test_runtime.py::TestRuntimeRetry::test_request_retry_3_times
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_408
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_502
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_max
tests/asynctests/test_pipeline.py::test_conf_async_trio_requests
tests/asynctests/test_universal_http.py::test_conf_async_trio_requests
  /usr/lib/python3.8/site-packages/urllib3/util/retry.py:328: DeprecationWarning: Using 'method_whitelist' with Retry is deprecated and will be removed in v2.0. Use 'allowed_methods' instead
    warnings.warn(
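The method_whitelist deprecation warnings above come from constructing urllib3 Retry objects with the pre-1.26 parameter name. The replacement the warning suggests (a sketch, assuming urllib3 >= 1.26):

    from urllib3.util.retry import Retry

    # Old spelling, deprecated and removed in urllib3 2.0:
    #   Retry(total=3, method_whitelist=frozenset({"GET", "HEAD"}))
    # Current spelling suggested by the warning:
    retry = Retry(total=3, allowed_methods=frozenset({"GET", "HEAD"}))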
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_3_times
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_408
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_502
tests/test_runtime.py::TestRuntimeRetry::test_request_retry_max
tests/asynctests/test_pipeline.py::test_conf_async_trio_requests
tests/asynctests/test_universal_http.py::test_conf_async_trio_requests
  /usr/lib/python3.8/site-packages/urllib3/util/retry.py:338: DeprecationWarning: Using 'method_whitelist' with Retry is deprecated and will be removed in v2.0. Use 'allowed_methods' instead
    return type(self)(**params)

tests/asynctests/test_async_client.py: 2 warnings
tests/asynctests/test_async_paging.py: 5 warnings
tests/asynctests/test_pipeline.py: 4 warnings
tests/asynctests/test_polling.py: 4 warnings
tests/asynctests/test_universal_http.py: 3 warnings
  /usr/lib/python3.8/site-packages/_pytest/python.py:184: PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
  You need to install a suitable plugin for your async framework, for example:
    - anyio
    - pytest-asyncio
    - pytest-tornasync
    - pytest-trio
    - pytest-twisted
    warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
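The 18 skips below stem from the PytestUnhandledCoroutineWarning above: the suite's async def tests carry @pytest.mark.asyncio, but no async plugin is installed in the test environment, so the mark is unknown and the coroutines are never run. With pytest-asyncio present the mark would be registered and the tests executed, e.g. (illustrative test, not from the suite):

    import pytest

    @pytest.mark.asyncio  # recognized once pytest-asyncio is installed
    async def test_example():
        assert 1 + 1 == 2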
============================= slowest 10 durations =============================
4.84s call tests/asynctests/test_pipeline.py::test_conf_async_trio_requests
4.83s call tests/asynctests/test_universal_http.py::test_conf_async_trio_requests
1.00s call tests/test_polling.py::test_poller
0.07s call tests/test_runtime.py::TestRedirect::test_request_redirect_get
0.06s call tests/test_runtime.py::TestRuntime::test_request_proxy
0.06s call tests/test_client.py::TestServiceClient::test_client_send
0.05s call tests/test_runtime.py::TestRedirect::test_request_redirect_post
0.05s call tests/test_runtime.py::TestRedirect::test_request_redirect_head
0.05s call tests/test_runtime.py::TestRedirect::test_request_redirect_delete
0.05s call tests/test_client.py::TestServiceClient::test_format_url
=========================== short test summary info ============================
SKIPPED [18] ../../../../../../../usr/lib/python3.8/site-packages/_pytest/python.py:185: async def function and no async plugin installed (see warnings)
FAILED tests/asynctests/test_pipeline.py::test_conf_async_trio_requests - msr...
FAILED tests/asynctests/test_universal_http.py::test_conf_async_trio_requests
=========== 2 failed, 190 passed, 18 skipped, 75 warnings in 15.01s ============
 * ERROR: dev-python/msrest-0.7.1::guru failed (test phase):
 *   pytest failed with python3.8
 *
 * Call stack:
 *   ebuild.sh, line  122:  Called src_test
 *   environment, line 3524:  Called distutils-r1_src_test
 *   environment, line 1700:  Called _distutils-r1_run_foreach_impl 'python_test'
 *   environment, line  778:  Called python_foreach_impl 'distutils-r1_run_phase' 'python_test'
 *   environment, line 3200:  Called multibuild_foreach_variant '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
 *   environment, line 2666:  Called _multibuild_run '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
 *   environment, line 2664:  Called _python_multibuild_wrapper 'distutils-r1_run_phase' 'python_test'
 *   environment, line 1114:  Called distutils-r1_run_phase 'python_test'
 *   environment, line 1623:  Called python_test
 *   environment, line 3491:  Called distutils-r1_python_test
 *   environment, line 1573:  Called epytest
 *   environment, line 2178:  Called die
 * The specific snippet of code:
 *       "${@}" || die -n "pytest failed with ${EPYTHON}";
 *
 * If you need support, post the output of `emerge --info '=dev-python/msrest-0.7.1::guru'`,
 * the complete build log and the output of `emerge -pqv '=dev-python/msrest-0.7.1::guru'`.
 * The complete build log is located at '/var/log/emerge-log/build/dev-python/msrest-0.7.1:20221105-023803.log'.
 * For convenience, a symlink to the build log is located at '/var/tmp/portage/dev-python/msrest-0.7.1/temp/build.log'.
 * The ebuild environment file is located at '/var/tmp/portage/dev-python/msrest-0.7.1/temp/environment'.
 * Working directory: '/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1'
 * S: '/var/tmp/portage/dev-python/msrest-0.7.1/work/msrest-0.7.1'