* Package: dev-python/asttokens-2.0.5 * Repository: gentoo * Maintainer: python@gentoo.org * USE: abi_x86_64 amd64 elibc_musl kernel_linux python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 test userland_GNU * FEATURES: network-sandbox preserve-libs sandbox test userpriv usersandbox @@@@@ PLEASE PAY ATTENTION HERE!!! @@@@@ This information may help you to understand if this is a duplicate or if this bug exists after you pushed a fix; This ebuild was merged at the following commit: https://github.com/gentoo-mirror/gentoo/commit/3fbacbc67093e132cf2e2b6324b8612ad0a67177 (Fri Mar 4 01:34:55 UTC 2022) @@@@@ END @@@@@ ################## # emerge --info: # ################## Portage 3.0.30 (python 3.10.2-final-0, default/linux/amd64/17.0/musl/hardened, gcc-11.2.1, musl-1.2.2-r8, 4.19.174-gentoo x86_64) ================================================================= System uname: Linux-4.19.174-gentoo-x86_64-Intel-R-_Xeon-R-_CPU_E5-2650_v4_@_2.20GHz-with-libc KiB Mem: 264046488 total, 200092992 free KiB Swap: 0 total, 0 free Timestamp of repository gentoo: Fri, 04 Mar 2022 01:07:08 +0000 sh bash 5.1_p16 ld GNU ld (Gentoo 2.37_p1 p2) 2.37 app-misc/pax-utils: 1.3.3::gentoo app-shells/bash: 5.1_p16::gentoo dev-lang/perl: 5.34.0-r8::gentoo dev-lang/python: 3.8.12_p1-r2::gentoo, 3.9.10-r1::gentoo, 3.10.2-r1::gentoo dev-util/cmake: 3.22.2::gentoo dev-util/meson: 0.61.1::gentoo sys-apps/baselayout: 2.8::gentoo sys-apps/openrc: 0.44.10::gentoo sys-apps/sandbox: 2.29::gentoo sys-devel/autoconf: 2.71-r1::gentoo sys-devel/automake: 1.16.5::gentoo sys-devel/binutils: 2.37_p1-r2::gentoo sys-devel/binutils-config: 5.4.1::gentoo sys-devel/gcc: 11.2.1_p20220115::gentoo sys-devel/gcc-config: 2.5-r1::gentoo sys-devel/libtool: 2.4.6-r6::gentoo sys-devel/make: 4.3::gentoo sys-kernel/linux-headers: 5.16::gentoo (virtual/os-headers) sys-libs/musl: 1.2.2-r8::gentoo Repositories: gentoo location: /usr/portage sync-type: rsync sync-uri: rsync://rsync.gentoo.org/gentoo-portage priority: -1000 sync-rsync-verify-metamanifest: yes sync-rsync-extra-opts: sync-rsync-verify-jobs: 1 sync-rsync-verify-max-age: 24 ACCEPT_KEYWORDS="amd64 ~amd64" ACCEPT_LICENSE="* Apache-2.0" CBUILD="x86_64-gentoo-linux-musl" CFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" CHOST="x86_64-gentoo-linux-musl" CONFIG_PROTECT="/etc /usr/share/gnupg/qualified.txt" CONFIG_PROTECT_MASK="/etc/ca-certificates.conf /etc/env.d /etc/fonts/fonts.conf /etc/gconf /etc/gentoo-release /etc/revdep-rebuild /etc/sandbox.d /etc/terminfo" CXXFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" DISTDIR="/var/tmp/portage/dev-python/asttokens-2.0.5/distdir" EMERGE_DEFAULT_OPTS="--with-bdeps=y -1 -k -b" ENV_UNSET="CARGO_HOME DBUS_SESSION_BUS_ADDRESS DISPLAY GOBIN GOPATH PERL5LIB PERL5OPT PERLPREFIX PERL_CORE PERL_MB_OPT PERL_MM_OPT XAUTHORITY XDG_CACHE_HOME XDG_CONFIG_HOME XDG_DATA_HOME XDG_RUNTIME_DIR" FCFLAGS="-O2 -pipe -march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" FEATURES="assume-digests binpkg-docompress binpkg-dostrip binpkg-logs binpkg-multi-instance buildpkg buildpkg-live config-protect-if-modified distlocks ebuild-locks fixlafiles ipc-sandbox merge-sync network-sandbox news parallel-fetch pid-sandbox preserve-libs protect-owned qa-unresolved-soname-deps sandbox sfperms sign split-log strict test unknown-features-warn unmerge-logs unmerge-orphans userfetch userpriv usersandbox usersync xattr" FFLAGS="-O2 -pipe 
-march=x86-64 -frecord-gcc-switches -fno-diagnostics-color -fmessage-length=0" GENTOO_MIRRORS="http://mirror.leaseweb.com/gentoo/ http://ftp.snt.utwente.nl/pub/os/linux/gentoo/ http://ftp.belnet.be/pub/rsync.gentoo.org/gentoo/ http://distfiles.gentoo.org" INSTALL_MASK="charset.alias /usr/share/locale/locale.alias" LANG="C.UTF8" LDFLAGS="-Wl,-O1 -Wl,--as-needed -Wl,--defsym=__gentoo_check_ldflags__=0" MAKEOPTS="-j16" PKGDIR="/root/tbci/binpkg" PORTAGE_CONFIGROOT="/" PORTAGE_RSYNC_OPTS="--recursive --links --safe-links --perms --times --omit-dir-times --compress --force --whole-file --delete --stats --human-readable --timeout=180 --exclude=/distfiles --exclude=/local --exclude=/packages --exclude=/.git" PORTAGE_TMPDIR="/var/tmp" SHELL="/bin/bash" USE="acl amd64 bzip2 crypt elogind hardened iconv ipv6 jumbo-build libglvnd libtirpc native-symlinks ncurses nls nptl openmp pam pcre pie readline seccomp split-usr ssl ssp test unicode xattr xtpax zlib" ABI_X86="64" ELIBC="musl" KERNEL="linux" PYTHON_TARGETS="python3_8 python3_9 python3_10" USERLAND="GNU" Unset: ADDR2LINE, AR, ARFLAGS, AS, ASFLAGS, CC, CCLD, CONFIG_SHELL, CPP, CPPFLAGS, CTARGET, CXX, CXXFILT, ELFEDIT, EXTRA_ECONF, F77FLAGS, FC, GCOV, GPROF, LC_ALL, LD, LEX, LFLAGS, LIBTOOL, LINGUAS, MAKE, MAKEFLAGS, NM, OBJCOPY, OBJDUMP, PORTAGE_BINHOST, PORTAGE_BUNZIP2_COMMAND, PORTAGE_COMPRESS, PORTAGE_COMPRESS_FLAGS, PORTAGE_RSYNC_EXTRA_OPTS, RANLIB, READELF, RUSTFLAGS, SIZE, STRINGS, STRIP, YACC, YFLAGS ############################## # emerge history (qlop -mv): # ############################## 2022-03-04T04:21:27 >>> dev-python/typing-extensions-4.1.1 2022-03-04T04:21:29 >>> dev-python/six-1.16.0 2022-03-04T04:21:32 >>> dev-python/pluggy-1.0.0-r1 2022-03-04T04:21:30 >>> dev-python/iniconfig-1.1.1 2022-03-04T04:21:34 >>> dev-python/py-1.11.0-r1 2022-03-04T04:21:37 >>> dev-python/wrapt-1.13.2-r1 2022-03-04T04:21:35 >>> dev-python/lazy-object-proxy-1.7.1 2022-03-04T04:21:39 >>> dev-python/namespace-zope-1-r1 2022-03-04T04:22:08 >>> dev-python/zope-interface-5.4.0 2022-03-04T04:22:16 >>> dev-python/attrs-21.4.0 2022-03-04T04:22:22 >>> dev-python/pytest-7.0.1 2022-03-04T04:22:06 >>> dev-python/astroid-2.10.0 ####################################### # installed packages (qlist -ICvUSS): # ####################################### acct-group/audio-0-r1:0 acct-group/cdrom-0-r1:0 acct-group/dialout-0-r1:0 acct-group/disk-0-r1:0 acct-group/input-0-r1:0 acct-group/kmem-0-r1:0 acct-group/kvm-0-r1:0 acct-group/lp-0-r1:0 acct-group/man-0-r1:0 acct-group/messagebus-0-r1:0 acct-group/portage-0:0 acct-group/render-0-r1:0 acct-group/sgx-0:0 acct-group/sshd-0-r1:0 acct-group/tape-0-r1:0 acct-group/tty-0-r1:0 acct-group/video-0-r1:0 acct-user/man-1-r1:0 acct-user/messagebus-0-r1:0 acct-user/portage-0:0 acct-user/sshd-0-r1:0 app-admin/eselect-1.4.20:0 -doc -emacs -vim-syntax app-admin/perl-cleaner-2.30:0 app-arch/bzip2-1.0.8-r1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static -static-libs app-arch/gzip-1.11:0 -pic -static app-arch/libarchive-3.6.0:0/13 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -blake2 bzip2 e2fsprogs -expat iconv -lz4 lzma -lzo -nettle -static-libs xattr -zstd app-arch/tar-1.34:0 acl -minimal nls -selinux xattr app-arch/unzip-6.0_p26:0 bzip2 -natspec unicode app-arch/xz-utils-5.2.5-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 extra-filters nls 
split-usr -static-libs app-arch/zstd-1.5.2:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -lz4 -static-libs threads app-crypt/gnupg-2.3.4-r1:0 bzip2 -doc -ldap nls readline -selinux smartcard ssl -test tofu -tools -tpm -usb -user-socket -verify-sig -wks-server app-crypt/gpgme-1.17.0-r1:1/11.6.7 -common-lisp cxx -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -qt5 -static-libs app-crypt/libb2-0.98.1-r3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -native-cflags openmp -static-libs app-crypt/libmd-1.0.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 app-crypt/pinentry-1.2.0:0 -caps -efl -emacs -gnome-keyring -gtk ncurses -qt5 app-crypt/rhash-1.4.2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls ssl -static-libs app-editors/nano-6.2:0 -debug -justify -magic -minimal ncurses nls spell split-usr -static unicode app-eselect/eselect-fontconfig-1.1-r1:0 app-eselect/eselect-iptables-20211203:0 app-eselect/eselect-lib-bin-symlink-0.1.1-r1:0 app-eselect/eselect-pinentry-0.7.2:0 app-misc/c_rehash-1.7-r1:0 app-misc/ca-certificates-20211016.3.72:0 -cacert app-misc/editor-wrapper-4-r1:0 app-misc/mime-types-2.1.53:0 -nginx app-misc/pax-utils-1.3.3:0 -caps -debug -python -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 seccomp app-misc/tmux-3.2a:0 -debug -selinux -utempter -vim-syntax app-portage/eix-0.36.1:0 -debug -doc nls -sqlite app-portage/elt-patches-20211104:0 app-portage/gemato-16.2:0 gpg -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test -tools app-portage/gentoolkit-0.5.1-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test app-portage/portage-utils-0.93.3:0 nls openmp qmanifest qtegrity -static app-shells/bash-5.1_p16:0 -afs -bashlogger -examples -mem-scramble net nls -plugins readline app-shells/push-3.4:0 app-shells/quoter-4.2:0 app-text/ansifilter-2.18:0 -qt5 app-text/build-docbook-catalog-2.2:0 app-text/docbook-xml-dtd-4.5-r2:4.5 app-text/docbook-xml-dtd-4.4-r3:4.4 app-text/docbook-xml-dtd-4.2-r3:4.2 app-text/docbook-xml-dtd-4.1.2-r7:4.1.2 app-text/docbook-xsl-stylesheets-1.79.1-r2:0 -ruby app-text/manpager-1:0 app-text/sgml-common-0.6.3-r7:0 app-text/xmlto-0.0.28-r8:0 -latex -text dev-db/sqlite-3.38.0:3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc -icu readline -secure-delete -static-libs -tcl -test -tools dev-lang/perl-5.34.0-r8:0/5.34 -berkdb -debug -doc -gdbm ithreads -minimal -quadmath dev-lang/python-3.10.2-r1:3.10 -bluetooth -build -examples gdbm hardened -libedit -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst xml dev-lang/python-3.9.10-r1:3.9 -bluetooth -build -examples gdbm hardened -lto ncurses -pgo readline sqlite ssl -test -tk -verify-sig -wininst xml dev-lang/python-3.8.12_p1-r2:3.8 -bluetooth -build -examples gdbm hardened ncurses readline sqlite ssl -test -tk -verify-sig -wininst xml dev-lang/python-exec-2.4.8:2 native-symlinks python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-lang/python-exec-conf-2.4.6:2 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 
python_targets_python3_9 dev-lang/tcl-8.6.12:0/8.6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug threads dev-libs/elfutils-0.186:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma nls -static-libs -test -threads utils -valgrind -zstd dev-libs/expat-2.4.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples -static-libs unicode dev-libs/glib-2.70.4:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -dbus -debug elf -fam -gtk-doc mime -selinux -static-libs -sysprof -systemtap -test -utils xattr dev-libs/gmp-6.2.1-r2:0/10.4 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cxx -doc -pic -static-libs dev-libs/gobject-introspection-1.70.0:0 -doctool -gtk-doc -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 -test dev-libs/gobject-introspection-common-1.70.0:0 dev-libs/isl-0.24-r2:0/23 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/jsoncpp-1.9.5:0/25 -doc -test dev-libs/libassuan-2.5.5:0 dev-libs/libbsd-0.11.5:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/libevent-2.1.12:0/2.1-7 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 clock-gettime -debug -malloc-replacement ssl -static-libs -test threads -verbose-debug dev-libs/libffi-3.4.2-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -exec-static-trampoline -pax-kernel -static-libs -test dev-libs/libgcrypt-1.9.4-r1:0/20 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm -cpu_flags_arm_aes -cpu_flags_arm_neon -cpu_flags_arm_sha1 -cpu_flags_arm_sha2 -cpu_flags_ppc_altivec -cpu_flags_ppc_vsx2 -cpu_flags_ppc_vsx3 cpu_flags_x86_aes cpu_flags_x86_avx cpu_flags_x86_avx2 -cpu_flags_x86_padlock -cpu_flags_x86_sha cpu_flags_x86_sse4_1 -doc -o-flag-munging -static-libs -verify-sig dev-libs/libgpg-error-1.44:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -common-lisp nls -static-libs dev-libs/libksba-1.6.0:0 -static-libs dev-libs/libltdl-2.4.6:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/libpcre-8.45:3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 cxx -jit -libedit pcre16 pcre32 readline recursion-limit split-usr -static-libs unicode zlib dev-libs/libpcre2-10.39:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -jit -libedit pcre16 pcre32 readline recursion-limit split-usr -static-libs unicode zlib dev-libs/libpipeline-1.5.5:0 -test dev-libs/libtasn1-4.18.0:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -test -valgrind dev-libs/libunistring-1.0:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -static-libs dev-libs/libuv-1.43.0:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 
-abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 dev-libs/libxml2-2.9.13-r1:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -examples -icu -lzma python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 readline -static-libs -test dev-libs/libxslt-1.1.35:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 crypt -debug -examples -static-libs dev-libs/lzo-2.10:2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -examples split-usr -static-libs dev-libs/mpc-1.2.1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/mpfr-4.1.0_p13-r1:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs dev-libs/nettle-3.7.3:0/8-6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm -cpu_flags_arm_neon cpu_flags_x86_aes -cpu_flags_x86_sha -doc gmp -static-libs -test dev-libs/npth-1.6-r1:0 dev-libs/openssl-1.1.1m:0/1.1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 asm cpu_flags_x86_sse2 -rfc3779 -sctp -sslv3 -static-libs -test -tls-compression -tls-heartbeat -vanilla dev-libs/popt-1.18:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static-libs dev-perl/Devel-CheckLib-1.140.0:0 -test dev-perl/Encode-Locale-1.50.0-r1:0 -test dev-perl/File-Listing-6.140.0:0 -test dev-perl/HTML-Parser-3.760.0:0 -test dev-perl/HTML-Tagset-3.200.0-r2:0 dev-perl/HTTP-Cookies-6.100.0:0 -test dev-perl/HTTP-Date-6.50.0:0 dev-perl/HTTP-Message-6.330.0:0 -test -test dev-perl/HTTP-Negotiate-6.10.0-r2:0 -test dev-perl/IO-HTML-1.4.0:0 -test dev-perl/IO-Socket-INET6-2.720.0-r2:0 -test dev-perl/IO-Socket-SSL-2.74.0:0 -examples -idn -test dev-perl/libwww-perl-6.600.0-r1:0 ssl -test dev-perl/Locale-gettext-1.70.0-r1:0 -test dev-perl/LWP-MediaTypes-6.40.0:0 -test dev-perl/LWP-Protocol-https-6.100.0:0 -test dev-perl/Module-Build-0.423.100:0 -test dev-perl/Mozilla-CA-20999999-r1:0 -test dev-perl/Net-HTTP-6.210.0:0 -minimal -test dev-perl/Net-SSLeay-1.900.0:0 -examples -examples -minimal -test dev-perl/Socket6-0.290.0:0 -test dev-perl/TimeDate-2.330.0-r1:0 -test dev-perl/Try-Tiny-0.310.0:0 -minimal -test dev-perl/URI-5.100.0:0 -test dev-perl/WWW-RobotRules-6.20.0-r2:0 -test dev-perl/XML-Parser-2.460.0-r2:0 dev-python/appdirs-1.4.4-r2:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 dev-python/astroid-2.10.0:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/attrs-21.4.0:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/certifi-3021.3.16-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/charset_normalizer-2.0.12:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/flit_core-3.7.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/idna-3.3:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 dev-python/importlib_metadata-4.11.2:0 
-doc -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/importlib_resources-5.4.0-r3:0 -doc -python_targets_pypy3 python_targets_python3_8 -test dev-python/iniconfig-1.1.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/installer-0.5.0:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/jaraco-context-4.1.1-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/jaraco-functools-3.5.0-r1:0 -doc -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/jaraco-text-3.7.0-r1:0 -doc -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/jinja-3.0.3:0 -doc -examples -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/lazy-object-proxy-1.7.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/markupsafe-2.1.0:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/more-itertools-8.12.0-r1:0 -doc -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/namespace-zope-1-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 dev-python/ordered-set-4.1.0:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/packaging-21.3-r2:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/pluggy-1.0.0-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/py-1.11.0-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/pyparsing-3.0.7-r1:0 -examples -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/pypax-0.9.5-r1:0 -ptpax python_targets_python3_8 python_targets_python3_9 xtpax dev-python/PySocks-1.7.1-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 dev-python/pytest-7.0.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/requests-2.27.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -socks5 -test dev-python/setuptools-60.9.2:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/setuptools_scm-6.4.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/six-1.16.0:0 -doc -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/tomli-2.0.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/typing-extensions-4.1.1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 dev-python/urllib3-1.26.8:0 -brotli -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 
python_targets_python3_9 -test dev-python/wheel-0.37.1-r1:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/wrapt-1.13.2-r1:0 -doc -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/zipp-3.7.0-r1:0 -doc -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-python/zope-interface-5.4.0:0 -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-util/cmake-3.22.2:0 -doc -emacs ncurses -qt5 -test -test dev-util/desktop-file-utils-0.26-r1:0 -emacs dev-util/glib-utils-2.70.4:0 -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 dev-util/gperf-3.1:0 dev-util/gtk-doc-am-1.33.2:0 dev-util/intltool-0.51.0-r2:0 dev-util/meson-0.61.1:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -test dev-util/meson-format-array-0:0 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 dev-util/ninja-1.10.2-r1:0 -doc -emacs -test -vim-syntax dev-util/pkgconf-1.8.0-r1:0/3 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -test dev-util/re2c-2.2:0 -debug -test dev-vcs/git-2.35.1:0 blksha1 -cgi curl -cvs -doc -emacs -gnome-keyring gpg -highlight iconv -mediawiki -mediawiki-experimental nls pcre -perforce -perl -ppcsha1 -python_single_target_python3_10 -python_single_target_python3_8 python_single_target_python3_9 -subversion -test threads -tk webdav -xinetd media-fonts/liberation-fonts-2.1.3:0 -X -X -fontforge media-gfx/graphite2-1.3.14_p20210810-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -perl -test media-libs/fontconfig-2.13.1-r2:1.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -doc -static-libs media-libs/freetype-2.11.1:2 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 adobe-cff -brotli bzip2 cleartype-hinting -debug -doc -fontforge harfbuzz -infinality png -static-libs -utils media-libs/harfbuzz-4.0.0:0/4.0.0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 cairo -debug -doc -experimental glib graphite -icu introspection -test truetype media-libs/libpng-1.6.37-r2:0/16 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -apng -cpu_flags_arm_neon cpu_flags_x86_sse -static-libs net-dns/libidn2-2.3.2:0/2 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs -verify-sig net-firewall/iptables-1.8.7-r1:0/1.8.3 -conntrack -netlink -nftables -pcap split-usr -static-libs net-libs/gnutls-3.7.3:0/30 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 cxx -dane -doc -examples -guile idn nls openssl -pkcs11 seccomp -sslv2 -sslv3 -static-libs -test -test-full tls-heartbeat -tools -valgrind net-libs/libmnl-1.0.4:0/0.2.0 -examples split-usr -static-libs net-libs/nghttp2-1.47.0:0/1.14 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cxx -debug -hpack-tools -jemalloc -static-libs -test threads -utils -xml net-misc/curl-7.81.0:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 
-abi_x86_32 abi_x86_64 -abi_x86_x32 -adns -alt-svc -brotli -curl_ssl_gnutls -curl_ssl_mbedtls -curl_ssl_nss curl_ssl_openssl ftp -gnutls -gopher -hsts http2 -idn imap ipv6 -kerberos -ldap -mbedtls -nghttp3 -nss openssl pop3 progress-meter -quiche -rtmp -samba smtp -ssh ssl -sslv3 -static-libs -telnet -test tftp -threads -zstd net-misc/iputils-20210722-r1:0 arping -caps -clockdiff -doc filecaps -gcrypt -idn -nettle nls -rarpd -rdisc ssl -static -test -tftpd -tracepath -traceroute6 net-misc/netifrc-0.7.3:0 net-misc/openssh-8.9_p1-r1:0 -X -X509 -abi_mips_n32 -audit -debug -hpn -kerberos -ldns -libedit -livecd pam pie scp -sctp -security-key -selinux ssl -static -test -xmss net-misc/rsync-3.2.3-r5:0 acl -examples iconv ipv6 -lz4 ssl -stunnel -system-zlib xattr -xxhash -zstd net-misc/wget-1.21.3:0 -cookie-check -debug -gnutls -idn ipv6 -metalink nls -ntlm pcre ssl -static -test -uuid zlib perl-core/CPAN-2.290.0-r1:0 perl-core/Encode-3.120.0:0 perl-core/File-Temp-0.231.100:0 perl-core/Scalar-List-Utils-1.560.0:0 sec-keys/openpgp-keys-gentoo-release-20220101:0 -test sys-apps/acl-2.3.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls split-usr -static-libs sys-apps/attr-2.5.1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug nls split-usr -static-libs sys-apps/baselayout-2.8:0 -build split-usr sys-apps/coreutils-9.0-r2:0 acl -caps -gmp -hostname -kill -multicall nls -selinux split-usr -static -test -vanilla xattr sys-apps/dbus-1.12.22:0 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -debug -doc elogind -selinux -static-libs -systemd -test -test sys-apps/debianutils-5.5:0 installkernel -static sys-apps/diffutils-3.8:0 nls -static sys-apps/elfix-0.9.5:0 -ptpax xtpax sys-apps/file-5.41:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 bzip2 -lzma -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -seccomp -static-libs zlib sys-apps/findutils-4.9.0:0 nls -selinux -static -test sys-apps/gawk-5.1.1-r1:0 -mpfr nls readline sys-apps/gentoo-functions-0.15:0 sys-apps/grep-3.7:0 nls pcre -static sys-apps/groff-1.22.4:0 -X -examples -uchardet sys-apps/help2man-1.48.5:0 nls sys-apps/install-xattr-0.8:0 sys-apps/iproute2-5.16.0:0 -atm -berkdb -bpf -caps -elf iptables -libbsd -minimal -selinux sys-apps/kbd-2.4.0:0 nls pam -test sys-apps/kmod-29:0 -debug -doc lzma -pkcs7 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs tools zlib -zstd sys-apps/less-590:0 pcre unicode sys-apps/man-db-2.10.1:0 -berkdb manpager -nls seccomp -selinux -static-libs zlib sys-apps/net-tools-2.10:0 arp hostname ipv6 -nis nls -plipconfig -selinux -slattach -static sys-apps/openrc-0.44.10:0 -audit -bash -debug ncurses netifrc -newnet pam -selinux -sysv-utils unicode sys-apps/portage-3.0.30-r1:0 -apidoc -build -doc -gentoo-dev ipc native-extensions -python_targets_pypy3 python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 rsync-verify -selinux -test xattr sys-apps/sandbox-2.29:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nnp sys-apps/sed-4.8:0 acl nls -selinux -static sys-apps/shadow-4.11.1:0/4 acl -audit -bcrypt -cracklib nls pam -selinux -skey split-usr -su xattr sys-apps/systemd-tmpfiles-249.9:0 -selinux -test sys-apps/sysvinit-3.01:0 -ibm 
-selinux -static sys-apps/texinfo-6.8:0 nls standalone -static sys-apps/util-linux-2.37.4:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -audit -build -caps cramfs -cryptsetup -fdformat hardlink -kill logger -magic ncurses nls pam -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 readline -rtas -selinux -slang split-usr -static-libs su suid -systemd -test -tty-helpers -udev unicode sys-apps/which-2.21:0 sys-auth/elogind-246.10-r2:0 acl -audit cgroup-hybrid -debug -doc pam -policykit -selinux sys-auth/pambase-20220214:0 -caps -debug elogind -gnome-keyring -homed -minimal -mktemp nullok -pam_krb5 -pam_ssh passwdqc -pwhistory -pwquality -securetty -selinux sha512 -systemd -yescrypt sys-auth/passwdqc-2.0.2-r1:0 sys-devel/autoconf-2.71-r1:2.71 -emacs sys-devel/autoconf-archive-2022.02.11:0 sys-devel/autoconf-wrapper-20220130:0 sys-devel/automake-1.16.5:1.16 -test sys-devel/automake-wrapper-11:0 sys-devel/binutils-2.37_p1-r2:2.37 -cet -default-gold -doc gold -multitarget nls -pgo plugins -static-libs -test -vanilla sys-devel/binutils-config-5.4.1:0 native-symlinks sys-devel/bison-3.8.2:0 -examples nls -static -test sys-devel/flex-2.6.4-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 nls -static -test sys-devel/gcc-11.2.1_p20220115:11 -ada -cet -custom-cflags cxx -d -debug -doc -fixed-point -fortran -go graphite hardened -jit -libssp lto -multilib -nls nptl -objc -objc++ -objc-gc openmp -pch -pgo pie -sanitize ssp -systemtap -test -valgrind -vanilla -vtv -zstd sys-devel/gcc-config-2.5-r1:0 cc-wrappers native-symlinks sys-devel/gettext-0.21-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl -cvs cxx -doc -emacs -git -java -java ncurses -nls openmp -static-libs sys-devel/gnuconfig-20210107:0 sys-devel/libtool-2.4.6-r6:2 -vanilla sys-devel/m4-1.4.19:0 -examples nls sys-devel/make-4.3:0 -guile nls -static sys-devel/patch-2.7.6-r4:0 -static -test xattr sys-fs/e2fsprogs-1.46.5:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cron -fuse -lto -nls split-usr -static-libs threads tools sys-fs/udev-249.9:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 acl kmod -selinux split-usr -test sys-fs/udev-init-scripts-35:0 sys-kernel/installkernel-gentoo-5:0 -grub sys-kernel/linux-headers-5.16:0 -headers-only sys-libs/argp-standalone-1.4.1-r1:0 -static-libs sys-libs/binutils-libs-2.37_p1-r2:0/2.37 -64-bit-bfd -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cet -multitarget nls -static-libs sys-libs/fts-standalone-1.2.7:0 -static-libs sys-libs/gdbm-1.23:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 berkdb nls readline -static-libs sys-libs/libcap-2.63:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 pam split-usr -static-libs -tools sys-libs/libseccomp-2.5.3:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -python python_targets_python3_10 python_targets_python3_8 python_targets_python3_9 -static-libs -test sys-libs/musl-1.2.2-r8:0 -headers-only sys-libs/ncurses-6.3_p20211106:0/6 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 
abi_x86_64 -abi_x86_x32 -ada cxx -debug -doc -gpm -minimal -profile split-usr -static-libs -test tinfo -trace sys-libs/obstack-standalone-1.1:0 -static-libs sys-libs/pam-1.5.2-r1:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -audit -berkdb -debug -filecaps -nis -selinux sys-libs/readline-8.1_p2:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 split-usr -static-libs unicode -utils sys-libs/zlib-1.2.11-r4:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 minizip split-usr -static-libs sys-process/procps-3.3.17-r1:0/8 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 elogind kill -modern-top ncurses nls -selinux split-usr -static-libs -systemd -test unicode sys-process/psmisc-23.4-r1:0 -X ipv6 nls -selinux virtual/acl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs virtual/awk-1:0 virtual/dev-manager-0-r2:0 virtual/editor-0-r3:0 virtual/libc-1-r1:0 virtual/libcrypt-1-r1:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -static-libs virtual/libelf-3:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 virtual/libiconv-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 virtual/libintl-0-r2:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 virtual/libudev-232-r5:0/1 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -systemd virtual/man-0-r4:0 virtual/os-headers-0-r2:0 virtual/package-manager-1:0 virtual/pager-0:0 virtual/perl-Carp-1.520.0-r1:0 virtual/perl-Compress-Raw-Bzip2-2.101.0:0 virtual/perl-Compress-Raw-Zlib-2.101.0:0 virtual/perl-CPAN-2.290.0:0 virtual/perl-CPAN-Meta-2.150.10-r5:0 virtual/perl-CPAN-Meta-Requirements-2.140.0-r7:0 virtual/perl-CPAN-Meta-YAML-0.18.0-r7:0 virtual/perl-Data-Dumper-2.179.0:0 virtual/perl-Digest-MD5-2.580.0:0 virtual/perl-Encode-3.120.0:0 virtual/perl-Exporter-5.760.0:0 virtual/perl-ExtUtils-CBuilder-0.280.236:0 virtual/perl-ExtUtils-Install-2.200.0:0 virtual/perl-ExtUtils-MakeMaker-7.620.0:0 virtual/perl-ExtUtils-Manifest-1.730.0:0 virtual/perl-ExtUtils-ParseXS-3.430.0:0 virtual/perl-File-Spec-3.800.0:0 virtual/perl-File-Temp-0.231.100:0 virtual/perl-Getopt-Long-2.520.0:0 virtual/perl-IO-1.460.0:0 virtual/perl-IO-Compress-2.102.0:0 virtual/perl-IO-Socket-IP-0.410.0:0 virtual/perl-JSON-PP-4.60.0:0 virtual/perl-libnet-3.130.0:0 ssl virtual/perl-MIME-Base64-3.160.0:0 virtual/perl-Module-Metadata-1.0.37-r1:0 virtual/perl-parent-0.238.0-r1:0 virtual/perl-Parse-CPAN-Meta-2.150.10-r5:0 virtual/perl-Perl-OSType-1.10.0-r5:0 virtual/perl-podlators-4.140.0-r2:0 virtual/perl-Scalar-List-Utils-1.560.0:0 virtual/perl-Test-Harness-3.430.0:0 virtual/perl-Text-ParseWords-3.300.0-r8:0 virtual/perl-Time-Local-1.300.0:0 virtual/perl-version-0.992.800:0 virtual/perl-XSLoader-0.300.0-r4:0 virtual/pkgconfig-2-r1:0 virtual/service-manager-1:0 virtual/ssh-0:0 -minimal virtual/tmpfiles-0-r1:0 virtual/ttf-fonts-1-r1:0 virtual/udev-217-r3:0 virtual/yacc-0:0 www-client/pybugz-0.13-r1:0 python_targets_python3_8 python_targets_python3_9 x11-libs/cairo-1.16.0-r5:0 -X -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 
-abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -aqua -debug -gles2-only glib -opengl -static-libs svg -utils -valgrind x11-libs/pixman-0.40.0:0 -abi_mips_n32 -abi_mips_n64 -abi_mips_o32 -abi_s390_32 -abi_s390_64 -abi_x86_32 abi_x86_64 -abi_x86_x32 -cpu_flags_arm_iwmmxt -cpu_flags_arm_iwmmxt2 -cpu_flags_arm_neon -cpu_flags_ppc_altivec cpu_flags_x86_mmxext cpu_flags_x86_sse2 cpu_flags_x86_ssse3 -loongson2f -static-libs x11-misc/shared-mime-info-2.1:0 ####################### # build.log # ####################### >>> Unpacking source... >>> Unpacking asttokens-2.0.5.gh.tar.gz to /var/tmp/portage/dev-python/asttokens-2.0.5/work >>> Source unpacked in /var/tmp/portage/dev-python/asttokens-2.0.5/work >>> Preparing source in /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5 ... >>> Source prepared. >>> Configuring source in /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5 ... >>> Source configured. >>> Compiling source in /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5 ... * python3_8: running distutils-r1_run_phase distutils-r1_python_compile python3.8 setup.py build -j 16 running build running build_py creating /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_8/lib/asttokens copying asttokens/version.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_8/lib/asttokens copying asttokens/util.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_8/lib/asttokens copying asttokens/mark_tokens.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_8/lib/asttokens copying asttokens/line_numbers.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_8/lib/asttokens copying asttokens/asttokens.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_8/lib/asttokens copying asttokens/__init__.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_8/lib/asttokens warning: build_py: byte-compiling is disabled, skipping. * python3_9: running distutils-r1_run_phase distutils-r1_python_compile python3.9 setup.py build -j 16 running build running build_py creating /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_9/lib/asttokens copying asttokens/version.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_9/lib/asttokens copying asttokens/util.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_9/lib/asttokens copying asttokens/mark_tokens.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_9/lib/asttokens copying asttokens/line_numbers.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_9/lib/asttokens copying asttokens/asttokens.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_9/lib/asttokens copying asttokens/__init__.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_9/lib/asttokens warning: build_py: byte-compiling is disabled, skipping. 
* python3_10: running distutils-r1_run_phase distutils-r1_python_compile python3.10 setup.py build -j 16 running build running build_py creating /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_10/lib/asttokens copying asttokens/version.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_10/lib/asttokens copying asttokens/util.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_10/lib/asttokens copying asttokens/mark_tokens.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_10/lib/asttokens copying asttokens/line_numbers.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_10/lib/asttokens copying asttokens/asttokens.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_10/lib/asttokens copying asttokens/__init__.py -> /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5-python3_10/lib/asttokens warning: build_py: byte-compiling is disabled, skipping. >>> Source compiled. >>> Test phase: dev-python/asttokens-2.0.5 * python3_8: running distutils-r1_run_phase python_test python3.8 -m pytest -vv -ra -l -Wdefault --color=no -p no:cov -p no:flake8 -p no:flakes -p no:pylint --deselect tests/test_astroid.py::TestAstroid::test_slices ============================= test session starts ============================== platform linux -- Python 3.8.12, pytest-7.0.1, pluggy-1.0.0 -- /usr/bin/python3.8 cachedir: .pytest_cache rootdir: /var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5, configfile: setup.cfg collecting ... collected 107 items / 1 deselected / 106 selected tests/test_astroid.py::TestAstroid::test_adjacent_joined_strings <- tests/test_mark_tokens.py PASSED [ 0%] tests/test_astroid.py::TestAstroid::test_adjacent_strings <- tests/test_mark_tokens.py PASSED [ 1%] tests/test_astroid.py::TestAstroid::test_assert_nodes_equal <- tests/test_mark_tokens.py PASSED [ 2%] tests/test_astroid.py::TestAstroid::test_assignment_expressions <- tests/test_mark_tokens.py FAILED [ 3%] tests/test_astroid.py::TestAstroid::test_async_def <- tests/test_mark_tokens.py FAILED [ 4%] tests/test_astroid.py::TestAstroid::test_async_for_and_with <- tests/test_mark_tokens.py PASSED [ 5%] tests/test_astroid.py::TestAstroid::test_await <- tests/test_mark_tokens.py PASSED [ 6%] tests/test_astroid.py::TestAstroid::test_calling_lambdas <- tests/test_mark_tokens.py PASSED [ 7%] tests/test_astroid.py::TestAstroid::test_complex_numbers <- tests/test_mark_tokens.py PASSED [ 8%] tests/test_astroid.py::TestAstroid::test_complex_slice_and_parens <- tests/test_mark_tokens.py PASSED [ 9%] tests/test_astroid.py::TestAstroid::test_comprehensions <- tests/test_mark_tokens.py PASSED [ 10%] tests/test_astroid.py::TestAstroid::test_conditional_expr <- tests/test_mark_tokens.py PASSED [ 11%] tests/test_astroid.py::TestAstroid::test_decorators <- tests/test_mark_tokens.py FAILED [ 12%] tests/test_astroid.py::TestAstroid::test_deep_recursion <- tests/test_mark_tokens.py SKIPPED [ 13%] tests/test_astroid.py::TestAstroid::test_del_dict <- tests/test_mark_tokens.py PASSED [ 14%] tests/test_astroid.py::TestAstroid::test_dict_merge <- tests/test_mark_tokens.py PASSED [ 15%] tests/test_astroid.py::TestAstroid::test_dict_order <- tests/test_mark_tokens.py PASSED [ 16%] tests/test_astroid.py::TestAstroid::test_fixture1 <- tests/test_mark_tokens.py PASSED [ 16%] tests/test_astroid.py::TestAstroid::test_fixture10 <- tests/test_mark_tokens.py FAILED [ 17%] 
tests/test_astroid.py::TestAstroid::test_fixture11 <- tests/test_mark_tokens.py FAILED [ 18%] tests/test_astroid.py::TestAstroid::test_fixture12 <- tests/test_mark_tokens.py PASSED [ 19%] tests/test_astroid.py::TestAstroid::test_fixture13 <- tests/test_mark_tokens.py FAILED [ 20%] tests/test_astroid.py::TestAstroid::test_fixture2 <- tests/test_mark_tokens.py PASSED [ 21%] tests/test_astroid.py::TestAstroid::test_fixture3 <- tests/test_mark_tokens.py FAILED [ 22%] tests/test_astroid.py::TestAstroid::test_fixture4 <- tests/test_mark_tokens.py FAILED [ 23%] tests/test_astroid.py::TestAstroid::test_fixture5 <- tests/test_mark_tokens.py FAILED [ 24%] tests/test_astroid.py::TestAstroid::test_fixture6 <- tests/test_mark_tokens.py PASSED [ 25%] tests/test_astroid.py::TestAstroid::test_fixture7 <- tests/test_mark_tokens.py FAILED [ 26%] tests/test_astroid.py::TestAstroid::test_fixture8 <- tests/test_mark_tokens.py FAILED [ 27%] tests/test_astroid.py::TestAstroid::test_fixture9 <- tests/test_mark_tokens.py FAILED [ 28%] tests/test_astroid.py::TestAstroid::test_fstrings <- tests/test_mark_tokens.py FAILED [ 29%] tests/test_astroid.py::TestAstroid::test_keyword_arg_only <- tests/test_mark_tokens.py PASSED [ 30%] tests/test_astroid.py::TestAstroid::test_mark_tokens_multiline <- tests/test_mark_tokens.py PASSED [ 31%] tests/test_astroid.py::TestAstroid::test_mark_tokens_simple <- tests/test_mark_tokens.py FAILED [ 32%] tests/test_astroid.py::TestAstroid::test_nonascii <- tests/test_mark_tokens.py PASSED [ 33%] tests/test_astroid.py::TestAstroid::test_one_line_if_elif <- tests/test_mark_tokens.py PASSED [ 33%] tests/test_astroid.py::TestAstroid::test_paren_attr <- tests/test_mark_tokens.py PASSED [ 34%] tests/test_astroid.py::TestAstroid::test_parens_around_func <- tests/test_mark_tokens.py PASSED [ 35%] tests/test_astroid.py::TestAstroid::test_print_function <- tests/test_mark_tokens.py FAILED [ 36%] tests/test_astroid.py::TestAstroid::test_return_annotation <- tests/test_mark_tokens.py PASSED [ 37%] tests/test_astroid.py::TestAstroid::test_splat <- tests/test_mark_tokens.py FAILED [ 38%] tests/test_astroid.py::TestAstroid::test_statements_with_semicolons <- tests/test_mark_tokens.py PASSED [ 39%] tests/test_astroid.py::TestAstroid::test_sys_modules <- tests/test_mark_tokens.py FAILED [ 40%] tests/test_astroid.py::TestAstroid::test_trailing_commas <- tests/test_mark_tokens.py PASSED [ 41%] tests/test_astroid.py::TestAstroid::test_tuples <- tests/test_mark_tokens.py FAILED [ 42%] tests/test_astroid.py::TestAstroid::test_with <- tests/test_mark_tokens.py PASSED [ 43%] tests/test_asttokens.py::TestASTTokens::test_coding_declaration PASSED [ 44%] tests/test_asttokens.py::TestASTTokens::test_token_methods PASSED [ 45%] tests/test_asttokens.py::TestASTTokens::test_tokenizing PASSED [ 46%] tests/test_asttokens.py::TestASTTokens::test_unicode_offsets PASSED [ 47%] tests/test_asttokens.py::test_filename PASSED [ 48%] tests/test_asttokens.py::test_doesnt_have_location PASSED [ 49%] tests/test_line_numbers.py::TestLineNumbers::test_line_numbers PASSED [ 50%] tests/test_line_numbers.py::TestLineNumbers::test_unicode PASSED [ 50%] tests/test_line_numbers.py::TestLineNumbers::test_utf8_offsets PASSED [ 51%] tests/test_mark_tokens.py::TestMarkTokens::test_adjacent_joined_strings PASSED [ 52%] tests/test_mark_tokens.py::TestMarkTokens::test_adjacent_strings PASSED [ 53%] tests/test_mark_tokens.py::TestMarkTokens::test_assert_nodes_equal PASSED [ 54%] 
tests/test_mark_tokens.py::TestMarkTokens::test_assignment_expressions PASSED [ 55%] tests/test_mark_tokens.py::TestMarkTokens::test_async_def PASSED [ 56%] tests/test_mark_tokens.py::TestMarkTokens::test_async_for_and_with PASSED [ 57%] tests/test_mark_tokens.py::TestMarkTokens::test_await PASSED [ 58%] tests/test_mark_tokens.py::TestMarkTokens::test_calling_lambdas PASSED [ 59%] tests/test_mark_tokens.py::TestMarkTokens::test_complex_numbers PASSED [ 60%] tests/test_mark_tokens.py::TestMarkTokens::test_complex_slice_and_parens PASSED [ 61%] tests/test_mark_tokens.py::TestMarkTokens::test_comprehensions PASSED [ 62%] tests/test_mark_tokens.py::TestMarkTokens::test_conditional_expr PASSED [ 63%] tests/test_mark_tokens.py::TestMarkTokens::test_decorators PASSED [ 64%] tests/test_mark_tokens.py::TestMarkTokens::test_deep_recursion PASSED [ 65%] tests/test_mark_tokens.py::TestMarkTokens::test_del_dict PASSED [ 66%] tests/test_mark_tokens.py::TestMarkTokens::test_dict_merge PASSED [ 66%] tests/test_mark_tokens.py::TestMarkTokens::test_dict_order PASSED [ 67%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture1 PASSED [ 68%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture10 PASSED [ 69%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture11 PASSED [ 70%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture12 PASSED [ 71%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture13 PASSED [ 72%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture2 PASSED [ 73%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture3 PASSED [ 74%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture4 PASSED [ 75%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture5 PASSED [ 76%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture6 PASSED [ 77%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture7 PASSED [ 78%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture8 PASSED [ 79%] tests/test_mark_tokens.py::TestMarkTokens::test_fixture9 PASSED [ 80%] tests/test_mark_tokens.py::TestMarkTokens::test_fstrings PASSED [ 81%] tests/test_mark_tokens.py::TestMarkTokens::test_keyword_arg_only PASSED [ 82%] tests/test_mark_tokens.py::TestMarkTokens::test_mark_tokens_multiline PASSED [ 83%] tests/test_mark_tokens.py::TestMarkTokens::test_mark_tokens_simple PASSED [ 83%] tests/test_mark_tokens.py::TestMarkTokens::test_nonascii PASSED [ 84%] tests/test_mark_tokens.py::TestMarkTokens::test_one_line_if_elif PASSED [ 85%] tests/test_mark_tokens.py::TestMarkTokens::test_paren_attr PASSED [ 86%] tests/test_mark_tokens.py::TestMarkTokens::test_parens_around_func PASSED [ 87%] tests/test_mark_tokens.py::TestMarkTokens::test_print_function PASSED [ 88%] tests/test_mark_tokens.py::TestMarkTokens::test_return_annotation PASSED [ 89%] tests/test_mark_tokens.py::TestMarkTokens::test_slices PASSED [ 90%] tests/test_mark_tokens.py::TestMarkTokens::test_splat PASSED [ 91%] tests/test_mark_tokens.py::TestMarkTokens::test_statements_with_semicolons PASSED [ 92%] tests/test_mark_tokens.py::TestMarkTokens::test_sys_modules PASSED [ 93%] tests/test_mark_tokens.py::TestMarkTokens::test_trailing_commas PASSED [ 94%] tests/test_mark_tokens.py::TestMarkTokens::test_tuples PASSED [ 95%] tests/test_mark_tokens.py::TestMarkTokens::test_with PASSED [ 96%] tests/test_util.py::TestUtil::test_replace PASSED [ 97%] tests/test_util.py::TestUtil::test_walk_ast PASSED [ 98%] tests/test_util.py::TestUtil::test_walk_astroid PASSED [ 99%] tests/test_util.py::test_expect_token PASSED [100%] =================================== 
FAILURES =================================== ___________________ TestAstroid.test_assignment_expressions ____________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 63 text = 'def foo(answer=(p := 42)): # Valid, though not great style\n ...' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ('position', Position(lineno=21, col_offset=0, end_lineno=21, end_col_offset=7))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7))] vc1 = ('position', Position(lineno=21, col_offset=0, end_lineno=21, end_col_offset=7)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=21, col_offset=0, end_lineno=21, end_col_offset=7)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=21, col_offset=0, end_lineno=21, end_col_offset=7)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7)) vc1 = Position(lineno=21, col_offset=0, end_lineno=21, end_col_offset=7) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=21, col_offset=0, end_lineno=21, end_col_offset=7) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=21, col_offset=0, end_lineno=21, end_col_offset=7) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7) vc1 = 21 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = 21, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 21 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = first = 21, second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 21 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = first = 21, second = 2, msg = '21 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 21 != 2 first = 21 msg = '21 != 2' second = 2 self = standardMsg = '21 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_assignment_expressions(self): # From https://www.python.org/dev/peps/pep-0572/ > self.create_mark_checker(""" # Handle a matched regex if (match := pattern.search(data)) is not None: # Do something with match pass # A loop that can't be trivially rewritten using 2-arg iter() while chunk := file.read(8192): process(chunk) # Reuse a value that's expensive to compute [y := f(x), y**2, y**3] # Share a subexpression between a comprehension filter clause and its output filtered_data = [y for x in data if (y := f(x)) is not None] y0 = (y1 := f(x)) # Valid, though discouraged foo(x=(y := f(x))) # Valid, though probably confusing def foo(answer=(p := 42)): # Valid, though not great style ... def foo(answer: (p := 42) = 5): # Valid, but probably never useful ... 
lambda: (x := 1) # Valid, but unlikely to be useful (x := lambda: 1) # Valid lambda line: (m := re.match(pattern, line)) and m.group(1) # Valid if any((comment := line).startswith('#') for line in lines): print("First comment:", comment) if all((nonblank := line).strip() == '' for line in lines): print("All lines are blank") partial_sums = [total := total + v for v in values] """) self = tests/test_mark_tokens.py:678: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = ('\n' '# Handle a matched regex\n' 'if (match := pattern.search(data)) is not None:\n' ' # Do something with match\n' ' pass\n' '\n' "# A loop that can't be trivially rewritten using 2-arg iter()\n" 'while chunk := file.read(8192):\n' ' process(chunk)\n' '\n' "# Reuse a value that's expensive to compute\n" '[y := f(x), y**2, y**3]\n' '\n' '# Share a subexpression between a comprehension filter clause and its ' 'output\n' 'filtered_data = [y for x in data if (y := f(x)) is not None]\n' '\n' 'y0 = (y1 := f(x)) # Valid, though discouraged\n' '\n' 'foo(x=(y := f(x))) # Valid, though probably confusing\n' '\n' 'def foo(answer=(p := 42)): # Valid, though not great style\n' ' ...\n' '\n' 'def foo(answer: (p := 42) = 5): # Valid, but probably never useful\n' ' ...\n' '\n' 'lambda: (x := 1) # Valid, but unlikely to be useful\n' '\n' '(x := lambda: 1) # Valid\n' '\n' 'lambda line: (m := re.match(pattern, line)) and m.group(1) # Valid\n' '\n' "if any((comment := line).startswith('#') for line in lines):\n" ' print("First comment:", comment)\n' '\n' "if all((nonblank := line).strip() == '' for line in lines):\n" ' print("All lines are blank")\n' '\n' 'partial_sums = [total := total + v for v in values]\n') verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[57 chars] 21,\n 0,\n 21,\n 7],\n de[680 chars]))])" != "Func[57 chars] 2,\n 0,\n 2,\n 7],\n deco[678 chars]))])" E FunctionDef( E name='foo', E doc=None, E position=[ E - 21, E ? - E + 2, E 0, E - 21, E ? - E + 2, E 7], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='answer')], E defaults=[NamedExpr( E target=Name(name='p'), E value=Const( E value=42, E kind=None))], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Expr(value=Const( E value=Ellipsis, E kind=None))]) node = rebuilt_node = self = test_case = tested_nodes = 63 text = 'def foo(answer=(p := 42)): # Valid, though not great style\n ...' __________________________ TestAstroid.test_async_def __________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 3 text = '@decorator\nasync def foo():\n pass' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ...] t2 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=13))] t2 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13))] vc1 = ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=13)) vc2 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=13)) t2 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=13)) t2 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13)) vc1 = Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=13) vc2 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=13) t2 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=13) t2 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13) vc1 = 6 vc2 = 3 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 6 t2 = 3 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 6 t2 = 3 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 6 second = 3, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 6 msg = None second = 3 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 6 second = 3, msg = '6 != 3' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 6 != 3 first = 6 msg = '6 != 3' second = 3 self = standardMsg = '6 != 3' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_async_def(self): > self.create_mark_checker(""" async def foo(): pass @decorator async def foo(): pass """) self = tests/test_mark_tokens.py:646: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = '\nasync def foo():\n pass\n\n@decorator\nasync def foo():\n pass\n' verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Asyn[61 chars] 6,\n 0,\n 6,\n 13],\n de[500 chars]()])" != "Asyn[61 chars] 3,\n 0,\n 3,\n 13],\n de[500 chars]()])" E AsyncFunctionDef( E name='foo', E doc=None, E position=[ E - 6, E ? ^ E + 3, E ? ^ E 0, E - 6, E ? ^ E + 3, E ? ^ E 13], E decorators=Decorators(nodes=[Name(name='decorator')]), E args=Arguments( E vararg=None, E kwarg=None, E args=[], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Pass()]) node = rebuilt_node = self = test_case = tested_nodes = 3 text = '@decorator\nasync def foo():\n pass' _________________________ TestAstroid.test_decorators __________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 4 text = '@deco2(a=1)\ndef g(x):\n pass' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'g'), ('doc', None), ...] t2 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'g'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'g'), ('doc', None), ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=5))] t2 = [('decorators', ), ('args', ), ('returns', None), ('body', []), ('name', 'g'), ('doc', None), ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=5))] vc1 = ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=5)) vc2 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=5)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=5)) t2 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=5)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=5)) t2 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=5)) vc1 = Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=5) vc2 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=5) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=5) t2 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=5) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=6, col_offset=0, end_lineno=6, end_col_offset=5) t2 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=5) vc1 = 6 vc2 = 3 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 6 t2 = 3 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 6 t2 = 3 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 6 second = 3, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 6 msg = None second = 3 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 6 second = 3, msg = '6 != 3' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 6 != 3 first = 6 msg = '6 != 3' second = 3 self = standardMsg = '6 != 3' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_decorators(self): # See https://bitbucket.org/plas/thonny/issues/49/range-marker-fails-with-decorators source = textwrap.dedent(""" @deco1 def f(): pass @deco2(a=1) def g(x): pass @deco3() def g(x): pass """) > m = self.create_mark_checker(source) self = source = ('\n' '@deco1\n' 'def f():\n' ' pass\n' '@deco2(a=1)\n' 'def g(x):\n' ' pass\n' '\n' '@deco3()\n' 'def g(x):\n' ' pass\n') tests/test_mark_tokens.py:491: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = ('\n' '@deco1\n' 'def f():\n' ' pass\n' '@deco2(a=1)\n' 'def g(x):\n' ' pass\n' '\n' '@deco3()\n' 'def g(x):\n' ' pass\n') verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[54 chars] 6,\n 0,\n 6,\n 5],\n dec[723 chars]()])" != "Func[54 chars] 3,\n 0,\n 3,\n 5],\n dec[723 chars]()])" E FunctionDef( E name='g', E doc=None, E position=[ E - 6, E ? ^ E + 3, E ? ^ E 0, E - 6, E ? ^ E + 3, E ? 
^ E 5], E decorators=Decorators(nodes=[Call( E func=Name(name='deco2'), E args=[], E keywords=[Keyword( E arg='a', E value=Const( E value=1, E kind=None))])]), E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='x')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Pass()]) node = rebuilt_node = self = test_case = tested_nodes = 4 text = '@deco2(a=1)\ndef g(x):\n pass' __________________________ TestAstroid.test_fixture10 __________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 2 text = ('class TestCase(unittest.TestCase):\n' '\n' ' def setUp(self):\n' ' unittest.TestCase.setUp(self)\n' '\n' '\n' ' def tearDown(self):\n' ' unittest.TestCase.tearDown(self)\n' '\n' ' def testIt(self):\n' ' self.a = 10\n' ' self.xxx()\n' '\n' '\n' ' def xxx(self):\n' ' if False:\n' ' pass\n' " print('a')\n" '\n' ' if False:\n' ' pass\n' ' pass\n' '\n' ' if False:\n' ' pass\n' " print('rara')") tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, ]), ('name', 'TestCase'), ('doc', None), ...] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, ]), ('name', 'TestCase'), ('doc', None), ...] 
def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, , , ]), ('name', 'TestCase'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=14))] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, , , ]), ('name', 'TestCase'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14))] vc1 = ('body', [, , , ]) vc2 = ('body', [, , , ]) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('body', [, , , ]) t2 = ('body', [, , , ]) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('body', [, , , ]) t2 = ('body', [, , , ]) vc1 = [, , , ] vc2 = [, , , ] tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [, , , ] t2 = [, , , ] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [, , , ] t2 = [, , , ] vc1 = vc2 = tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'setUp'), ('doc', None), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'setUp'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'setUp'), ('doc', None), ('position', Position(lineno=6, col_offset=4, end_lineno=6, end_col_offset=13))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'setUp'), ('doc', None), ('position', Position(lineno=4, col_offset=4, end_lineno=4, end_col_offset=13))] vc1 = ('position', Position(lineno=6, col_offset=4, end_lineno=6, end_col_offset=13)) vc2 = ('position', Position(lineno=4, col_offset=4, end_lineno=4, end_col_offset=13)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=6, col_offset=4, end_lineno=6, end_col_offset=13)) t2 = ('position', Position(lineno=4, col_offset=4, end_lineno=4, end_col_offset=13)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=6, col_offset=4, end_lineno=6, end_col_offset=13)) t2 = ('position', Position(lineno=4, col_offset=4, end_lineno=4, end_col_offset=13)) vc1 = Position(lineno=6, col_offset=4, end_lineno=6, end_col_offset=13) vc2 = Position(lineno=4, col_offset=4, end_lineno=4, end_col_offset=13) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=6, col_offset=4, end_lineno=6, end_col_offset=13) t2 = Position(lineno=4, col_offset=4, end_lineno=4, end_col_offset=13) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=6, col_offset=4, end_lineno=6, end_col_offset=13) t2 = Position(lineno=4, col_offset=4, end_lineno=4, end_col_offset=13) vc1 = 6 vc2 = 4 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 6 t2 = 4 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 6 t2 = 4 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 6 second = 4, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. 
""" assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 6 msg = None second = 4 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 6 second = 4, msg = '6 != 4' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 6 != 4 first = 6 msg = '6 != 4' second = 4 self = standardMsg = '6 != 4' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture10(self): self.verify_fixture_file('astroid/noendingnewline.py') self = tests/test_mark_tokens.py:173: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/noendingnewline.py' self = source = ('import unittest\n' '\n' '\n' 'class TestCase(unittest.TestCase):\n' '\n' ' def setUp(self):\n' ' unittest.TestCase.setUp(self)\n' '\n' '\n' ' def tearDown(self):\n' ' unittest.TestCase.tearDown(self)\n' '\n' ' def testIt(self):\n' ' self.a = 10\n' ' self.xxx()\n' '\n' '\n' ' def xxx(self):\n' ' if False:\n' ' pass\n' " print('a')\n" '\n' ' if False:\n' ' pass\n' ' pass\n' '\n' ' if False:\n' ' pass\n' " print('rara')\n" '\n' '\n' "if __name__ == '__main__':\n" " print('test2')\n" ' unittest.main()\n' '\n' '\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Clas[82 chars] 4,\n 0,\n 4,\n 14],\n de[5192 chars]])])" != "Clas[82 chars] 2,\n 0,\n 2,\n 14],\n de[5190 chars]])])" E ClassDef( E name='TestCase', E doc=None, E is_dataclass=False, E position=[ E - 4, E ? ^ E + 2, E ? ^ E 0, E - 4, E ? ^ E + 2, E ? ^ E 14], E decorators=None, E bases=[Attribute( E attrname='TestCase', E expr=Name(name='unittest'))], E keywords=[], E body=[ E FunctionDef( E name='setUp', E doc=None, E position=[ E - 6, E 4, E - 6, E ? ^ E + 4, E ? ^ E + 4, E 13], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='self')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Expr(value=Call( E func=Attribute( E attrname='setUp', E expr=Attribute( E attrname='TestCase', E expr=Name(name='unittest'))), E args=[Name(name='self')], E keywords=[]))]), E FunctionDef( E name='tearDown', E doc=None, E position=[ E - 10, E ? ^^ E + 8, E ? ^ E 4, E - 10, E ? ^^ E + 8, E ? 
^ E 16], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='self')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Expr(value=Call( E func=Attribute( E attrname='tearDown', E expr=Attribute( E attrname='TestCase', E expr=Name(name='unittest'))), E args=[Name(name='self')], E keywords=[]))]), E FunctionDef( E name='testIt', E doc=None, E position=[ E - 13, E ? ^ E + 11, E ? ^ E 4, E - 13, E ? ^ E + 11, E ? ^ E 14], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='self')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Assign( E targets=[Attribute( E attrname='a', E expr=Name(name='self'))], E value=Const( E value=10, E kind=None)), E Expr(value=Call( E func=Attribute( E attrname='xxx', E expr=Name(name='self')), E args=[], E keywords=[]))]), E FunctionDef( E name='xxx', E doc=None, E position=[ E - 18, E ? ^ E + 16, E ? ^ E 4, E - 18, E ? ^ E + 16, E ? ^ E 11], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='self')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[ E If( E test=Const( E value=False, E kind=None), E body=[Pass(), Expr(value=Call( E func=Name(name='print'), E args=[Const( E value='a', E kind=None)], E keywords=[]))], E orelse=[]), E If( E test=Const( E value=False, E kind=None), E body=[Pass(), Pass()], E orelse=[]), E If( E test=Const( E value=False, E kind=None), E body=[Pass(), Expr(value=Call( E func=Name(name='print'), E args=[Const( E value='rara', E kind=None)], E keywords=[]))], E orelse=[])])]) node = rebuilt_node = self = test_case = tested_nodes = 2 text = ('class TestCase(unittest.TestCase):\n' '\n' ' def setUp(self):\n' ' unittest.TestCase.setUp(self)\n' '\n' '\n' ' def tearDown(self):\n' ' unittest.TestCase.tearDown(self)\n' '\n' ' def testIt(self):\n' ' self.a = 10\n' ' self.xxx()\n' '\n' '\n' ' def xxx(self):\n' ' if False:\n' ' pass\n' " print('a')\n" '\n' ' if False:\n' ' pass\n' ' pass\n' '\n' ' if False:\n' ' pass\n' " print('rara')") __________________________ TestAstroid.test_fixture11 __________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 10 text = 'class Aaa: pass' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ...] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9))] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9))] vc1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) vc1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) vc1 = 5 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 5 t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 5 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 5 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 5 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 5 second = 2, msg = '5 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 5 != 2 first = 5 msg = '5 != 2' second = 2 self = standardMsg = '5 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture11(self): self.verify_fixture_file('astroid/notall.py') self = tests/test_mark_tokens.py:174: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/notall.py' self = source = ('\n' "name = 'a'\n" '_bla = 2\n' "other = 'o'\n" 'class Aaa: pass\n' '\n' "def func(): print('yo')\n" '\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Clas[77 chars] 5,\n 0,\n 5,\n 9],\n dec[59 chars]()])" != "Clas[77 chars] 2,\n 0,\n 2,\n 9],\n dec[59 chars]()])" E ClassDef( E name='Aaa', E doc=None, E is_dataclass=False, E position=[ E - 5, E ? ^ E + 2, E ? ^ E 0, E - 5, E ? ^ E + 2, E ? ^ E 9], E decorators=None, E bases=[], E keywords=[], E body=[Pass()]) node = rebuilt_node = self = test_case = tested_nodes = 10 text = 'class Aaa: pass' __________________________ TestAstroid.test_fixture13 __________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 1 text = 'class NotImplemented(Exception):\n pass' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'NotImplemented'), ('doc', None), ...] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'NotImplemented'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'NotImplemented'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=20))] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'NotImplemented'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20))] vc1 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=20)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=20)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=20)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20)) vc1 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=20) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=20) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=20) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20) vc1 = 3 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 3 t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 3 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 3 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 3 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 3 second = 2, msg = '3 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 3 != 2 first = 3 msg = '3 != 2' second = 2 self = standardMsg = '3 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture13(self): self.verify_fixture_file('astroid/suppliermodule_test.py') self = tests/test_mark_tokens.py:176: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/suppliermodule_test.py' self = source = ('""" file suppliermodule.py """\n' '\n' 'class NotImplemented(Exception):\n' ' pass\n' '\n' 'class Interface:\n' ' def get_value(self):\n' ' raise NotImplemented()\n' '\n' ' def set_value(self, value):\n' ' raise NotImplemented()\n' '\n' 'class DoNothing : pass\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Clas[88 chars] 3,\n 0,\n 3,\n 20],\n de[82 chars]()])" != "Clas[88 chars] 2,\n 0,\n 2,\n 20],\n de[82 chars]()])" E ClassDef( E name='NotImplemented', E doc=None, E is_dataclass=False, E position=[ E - 3, E ? ^ E + 2, E ? ^ E 0, E - 3, E ? ^ E + 2, E ? ^ E 20], E decorators=None, E bases=[Name(name='Exception')], E keywords=[], E body=[Pass()]) node = rebuilt_node = self = test_case = tested_nodes = 1 text = 'class NotImplemented(Exception):\n pass' __________________________ TestAstroid.test_fixture3 ___________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 10 text = 'class Aaa: pass' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ...] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9))] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Aaa'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9))] vc1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) vc1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=9) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) vc1 = 5 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 5, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 5 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 5 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 5 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 5 second = 2, msg = '5 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 5 != 2 first = 5 msg = '5 != 2' second = 2 self = standardMsg = '5 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture3(self): self.verify_fixture_file('astroid/all.py') self = tests/test_mark_tokens.py:166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/all.py' self = source = ('\n' "name = 'a'\n" '_bla = 2\n' "other = 'o'\n" 'class Aaa: pass\n' '\n' "def func(): print('yo')\n" '\n' "__all__ = 'Aaa', '_bla', 'name'\n") tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Clas[77 chars] 5,\n 0,\n 5,\n 9],\n dec[59 chars]()])" != "Clas[77 chars] 2,\n 0,\n 2,\n 9],\n dec[59 chars]()])" E ClassDef( E name='Aaa', E doc=None, E is_dataclass=False, E position=[ E - 5, E ? ^ E + 2, E ? ^ E 0, E - 5, E ? ^ E + 2, E ? ^ E 9], E decorators=None, E bases=[], E keywords=[], E body=[Pass()]) node = rebuilt_node = self = test_case = tested_nodes = 10 text = 'class Aaa: pass' __________________________ TestAstroid.test_fixture4 ___________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 2 text = 'class Toto: pass' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Toto'), ('doc', None), ...] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Toto'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Toto'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10))] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', []), ('name', 'Toto'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10))] vc1 = ('position', Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10)) vc1 = Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10) vc1 = 4 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 4, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 4 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 4 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 4 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 4 second = 2, msg = '4 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 4 != 2 first = 4 msg = '4 != 2' second = 2 self = standardMsg = '4 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture4(self): self.verify_fixture_file('astroid/clientmodule_test.py') self = tests/test_mark_tokens.py:167: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/clientmodule_test.py' self = source = ('""" docstring for file clientmodule.py """\n' 'from data.suppliermodule_test import Interface as IFace, DoNothing\n' '\n' 'class Toto: pass\n' '\n' 'class Ancestor:\n' ' """ Ancestor method """\n' ' __implements__ = (IFace,)\n' '\n' ' def __init__(self, value):\n' ' local_variable = 0\n' " self.attr = 'this method shouldn\\'t have a docstring'\n" ' self.__value = value\n' '\n' ' def get_value(self):\n' ' """ nice docstring ;-) """\n' ' return self.__value\n' '\n' ' def set_value(self, value):\n' ' self.__value = value\n' " return 'this method shouldn\\'t have a docstring'\n" '\n' 'class Specialization(Ancestor):\n' " TYPE = 'final class'\n" " top = 'class'\n" '\n' ' def __init__(self, value, _id):\n' ' Ancestor.__init__(self, value)\n' ' self._id = _id\n' ' self.relation = DoNothing()\n' ' self.toto = Toto()\n' '\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Clas[78 chars] 4,\n 0,\n 4,\n 10],\n de[60 chars]()])" != "Clas[78 chars] 2,\n 0,\n 2,\n 10],\n de[60 chars]()])" E ClassDef( E name='Toto', E doc=None, E is_dataclass=False, E position=[ E - 4, E ? ^ E + 2, E ? ^ E 0, E - 4, E ? ^ E + 2, E ? 
^ E 10], E decorators=None, E bases=[], E keywords=[], E body=[Pass()]) node = rebuilt_node = self = test_case = tested_nodes = 2 text = 'class Toto: pass' __________________________ TestAstroid.test_fixture5 ___________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 2 text = ('class Page(object):\n' ' _urlOpen = staticmethod(urllib.urlopen)\n' '\n' ' def getPage(self, url):\n' ' handle = self._urlOpen(url)\n' ' data = handle.read()\n' ' handle.close()\n' ' return data') tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, ]), ('name', 'Page'), ('doc', None), ...] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, ]), ('name', 'Page'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, ]), ('name', 'Page'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=4, col_offset=0, end_lineno=4, end_col_offset=10))] t2 = [('decorators', None), ('bases', []), ('keywords', []), ('body', [, ]), ('name', 'Page'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=10))] vc1 = ('body', [, ]) vc2 = ('body', [, ]) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('body', [, ]) t2 = ('body', [, ]) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('body', [, ]) t2 = ('body', [, ]) vc1 = [, ] vc2 = [, ] tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [, ] t2 = [, ] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [, ] t2 = [, ] vc1 = vc2 = tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , ]), ('name', 'getPage'), ('doc', None), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , ]), ('name', 'getPage'), ('doc', None), ...] 
def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , , ]), ('name', 'getPage'), ('doc', None), ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=15))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , , ]), ('name', 'getPage'), ('doc', None), ('position', Position(lineno=5, col_offset=4, end_lineno=5, end_col_offset=15))] vc1 = ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=15)) vc2 = ('position', Position(lineno=5, col_offset=4, end_lineno=5, end_col_offset=15)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=15)) t2 = ('position', Position(lineno=5, col_offset=4, end_lineno=5, end_col_offset=15)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=15)) t2 = ('position', Position(lineno=5, col_offset=4, end_lineno=5, end_col_offset=15)) vc1 = Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=15) vc2 = Position(lineno=5, col_offset=4, end_lineno=5, end_col_offset=15) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=15) t2 = Position(lineno=5, col_offset=4, end_lineno=5, end_col_offset=15) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=15) t2 = Position(lineno=5, col_offset=4, end_lineno=5, end_col_offset=15) vc1 = 7 vc2 = 5 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 7, t2 = 5 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 7 t2 = 5 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 7 second = 5, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 7 msg = None second = 5 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 7 second = 5, msg = '7 != 5' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 7 != 5 first = 7 msg = '7 != 5' second = 5 self = standardMsg = '7 != 5' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture5(self): self.verify_fixture_file('astroid/descriptor_crash.py') self = tests/test_mark_tokens.py:168: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/descriptor_crash.py' self = source = ('\n' 'import urllib\n' '\n' 'class Page(object):\n' ' _urlOpen = staticmethod(urllib.urlopen)\n' '\n' ' def getPage(self, url):\n' ' handle = self._urlOpen(url)\n' ' data = handle.read()\n' ' handle.close()\n' ' return data\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Clas[78 chars] 4,\n 0,\n 4,\n 10],\n de[1962 chars]])])" != "Clas[78 chars] 2,\n 0,\n 2,\n 10],\n de[1962 chars]])])" E ClassDef( E name='Page', E doc=None, E is_dataclass=False, E position=[ E - 4, E ? ^ E + 2, E ? ^ E 0, E - 4, E ? ^ E + 2, E ? ^ E 10], E decorators=None, E bases=[Name(name='object')], E keywords=[], E body=[Assign( E targets=[Name(name='_urlOpen')], E value=Call( E func=Name(name='staticmethod'), E args=[Attribute( E attrname='urlopen', E expr=Name(name='urllib'))], E keywords=[])), E FunctionDef( E name='getPage', E doc=None, E position=[ E - 7, E ? ^ E + 5, E ? ^ E 4, E - 7, E ? ^ E + 5, E ? 
^ E 15], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='self'), Name(name='url')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None, None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None, None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[ E Assign( E targets=[Name(name='handle')], E value=Call( E func=Attribute( E attrname='_urlOpen', E expr=Name(name='self')), E args=[Name(name='url')], E keywords=[])), E Assign( E targets=[Name(name='data')], E value=Call( E func=Attribute( E attrname='read', E expr=Name(name='handle')), E args=[], E keywords=[])), E Expr(value=Call( E func=Attribute( E attrname='close', E expr=Name(name='handle')), E args=[], E keywords=[])), E Return(value=Name(name='data'))])]) node = rebuilt_node = self = test_case = tested_nodes = 2 text = ('class Page(object):\n' ' _urlOpen = staticmethod(urllib.urlopen)\n' '\n' ' def getPage(self, url):\n' ' handle = self._urlOpen(url)\n' ' data = handle.read()\n' ' handle.close()\n' ' return data') __________________________ TestAstroid.test_fixture7 ___________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 24 text = 'def definition(a,\n b,\n c):\n return a + b + c' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'definition'), ('doc', None), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'definition'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'definition'), ('doc', None), ('position', Position(lineno=15, col_offset=0, end_lineno=15, end_col_offset=14))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'definition'), ('doc', None), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14))] vc1 = ('position', Position(lineno=15, col_offset=0, end_lineno=15, end_col_offset=14)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=15, col_offset=0, end_lineno=15, end_col_offset=14)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=15, col_offset=0, end_lineno=15, end_col_offset=14)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14)) vc1 = Position(lineno=15, col_offset=0, end_lineno=15, end_col_offset=14) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=15, col_offset=0, end_lineno=15, end_col_offset=14) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=15, col_offset=0, end_lineno=15, end_col_offset=14) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=14) vc1 = 15 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 15 t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 15 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 15 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. 
""" assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 15 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 15 second = 2, msg = '15 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 15 != 2 first = 15 msg = '15 != 2' second = 2 self = standardMsg = '15 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture7(self): self.verify_fixture_file('astroid/format.py') self = tests/test_mark_tokens.py:170: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/format.py' self = source = ('"""A multiline string\n' '"""\n' '\n' "function('aeozrijz\\\n" "earzer', hop)\n" '# XXX write test\n' 'x = [i for i in range(5)\n' ' if i % 4]\n' '\n' 'fonction(1,\n' ' 2,\n' ' 3,\n' ' 4)\n' '\n' 'def definition(a,\n' ' b,\n' ' c):\n' ' return a + b + c\n' '\n' 'class debile(dict,\n' ' object):\n' ' pass\n' '\n' 'if aaaa: pass\n' 'else:\n' ' aaaa,bbbb = 1,2\n' ' aaaa,bbbb = bbbb,aaaa\n' '# XXX write test\n' 'hop = \\\n' ' aaaa\n' '\n' '\n' '__revision__.lower();\n' '\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[63 chars] 15,\n 0,\n 15,\n 14],\n [830 chars]))])" != "Func[63 chars] 2,\n 0,\n 2,\n 14],\n de[828 chars]))])" E FunctionDef( E name='definition', E doc=None, E position=[ E - 15, E ? ^^ E + 2, E ? ^ E 0, E - 15, E ? ^^ E + 2, E ? ^ E 14], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[ E Name(name='a'), E Name(name='b'), E Name(name='c')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[ E None, E None, E None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[ E None, E None, E None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Return(value=BinOp( E op='+', E left=BinOp( E op='+', E left=Name(name='a'), E right=Name(name='b')), E right=Name(name='c')))]) node = rebuilt_node = self = test_case = tested_nodes = 24 text = 'def definition(a,\n b,\n c):\n return a + b + c' __________________________ TestAstroid.test_fixture8 ___________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 11 text = ('def global_access(key, val):\n' ' """function test"""\n' ' local = 1\n' ' MY_DICT[key] = val\n' ' for i in val:\n' ' if i:\n' ' del MY_DICT[i]\n' ' continue\n' ' else:\n' ' break\n' ' else:\n' ' return') tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'global_access'), ('doc', 'function test'), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'global_access'), ('doc', 'function test'), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , ]), ('name', 'global_access'), ('doc', 'function test'), ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , ]), ('name', 'global_access'), ('doc', 'function test'), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17))] vc1 = ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17)) vc1 = Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17) vc1 = 11 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 11 t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 11 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 11 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 11 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 11 second = 2, msg = '11 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 11 != 2 first = 11 msg = '11 != 2' second = 2 self = standardMsg = '11 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture8(self): self.verify_fixture_file('astroid/module.py') self = tests/test_mark_tokens.py:171: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/module.py' self = source = ('"""test module for astroid\n' '"""\n' '\n' "__revision__ = '$Id: module.py,v 1.2 2005-11-02 11:56:54 syt Exp $'\n" 'from astroid.node_classes import Name as NameNode\n' 'from astroid import modutils\n' 'from astroid.utils import *\n' 'import os.path\n' 'MY_DICT = {}\n' '\n' 'def global_access(key, val):\n' ' """function test"""\n' ' local = 1\n' ' MY_DICT[key] = val\n' ' for i in val:\n' ' if i:\n' ' del MY_DICT[i]\n' ' continue\n' ' else:\n' ' break\n' ' else:\n' ' return\n' '\n' '\n' 'class YO:\n' ' """hehe"""\n' ' a = 1\n' ' \n' ' def __init__(self):\n' ' try:\n' ' self.yo = 1\n' ' except ValueError as ex:\n' ' pass\n' ' except (NameError, TypeError):\n' ' raise XXXError()\n' ' except:\n' ' raise\n' '\n' '\n' '\n' 'class YOUPI(YO):\n' ' class_attr = None\n' ' \n' ' def __init__(self):\n' ' self.member = None\n' ' \n' ' def method(self):\n' ' """method test"""\n' ' global MY_DICT\n' ' try:\n' ' MY_DICT = {}\n' ' local = None\n' ' autre = [a for (a, b) in MY_DICT if b]\n' ' if b in autre:\n' ' return\n' ' else:\n' ' if a in autre:\n' " return 'hehe'\n" ' global_access(local, val=autre)\n' ' finally:\n' ' return local\n' ' \n' ' def static_method():\n' ' """static method test"""\n' " assert MY_DICT, '???'\n" ' static_method = staticmethod(static_method)\n' ' \n' ' def class_method(cls):\n' ' """class method 
test"""\n' ' exec(a, b)\n' ' class_method = classmethod(class_method)\n' '\n' '\n' 'def four_args(a, b, c, d):\n' ' """four arguments (was nested_args)"""\n' ' while 1:\n' ' if a:\n' ' break\n' ' a += +1\n' ' else:\n' ' b += -2\n' ' if c:\n' ' d = ((a) and (b)) or (c)\n' ' else:\n' ' c = ((a) and (b)) or (d)\n' ' list(map(lambda x, y: (y, x), a))\n' 'redirect = four_args\n' '\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[77 chars] 11,\n 0,\n 11,\n 17],\n [1294 chars]])])" != "Func[77 chars] 2,\n 0,\n 2,\n 17],\n de[1292 chars]])])" E FunctionDef( E name='global_access', E doc='function test', E position=[ E - 11, E ? ^^ E + 2, E ? ^ E 0, E - 11, E ? ^^ E + 2, E ? ^ E 17], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='key'), Name(name='val')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None, None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None, None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[ E Assign( E targets=[Name(name='local')], E value=Const( E value=1, E kind=None)), E Assign( E targets=[Subscript( E ctx=, E value=Name(name='MY_DICT'), E slice=Name(name='key'))], E value=Name(name='val')), E For( E target=Name(name='i'), E iter=Name(name='val'), E body=[If( E test=Name(name='i'), E body=[Delete(targets=[Subscript( E ctx=, E value=Name(name='MY_DICT'), E slice=Name(name='i'))]), E Continue()], E orelse=[Break()])], E orelse=[Return(value=None)])]) node = rebuilt_node = self = test_case = tested_nodes = 11 text = ('def global_access(key, val):\n' ' """function test"""\n' ' local = 1\n' ' MY_DICT[key] = val\n' ' for i in val:\n' ' if i:\n' ' del MY_DICT[i]\n' ' continue\n' ' else:\n' ' break\n' ' else:\n' ' return') __________________________ TestAstroid.test_fixture9 ___________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 3 text = 'class Specialization(YOUPI, YO):\n pass' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. 
We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('bases', [, ]), ('keywords', []), ('body', []), ('name', 'Specialization'), ('doc', None), ...] t2 = [('decorators', None), ('bases', [, ]), ('keywords', []), ('body', []), ('name', 'Specialization'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('bases', [, ]), ('keywords', []), ('body', []), ('name', 'Specialization'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20))] t2 = [('decorators', None), ('bases', [, ]), ('keywords', []), ('body', []), ('name', 'Specialization'), ('doc', None), ('is_dataclass', False), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20))] vc1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20)) vc1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=20) vc1 = 5 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 5, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 5 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 5 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. 
""" assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 5 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 5 second = 2, msg = '5 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 5 != 2 first = 5 msg = '5 != 2' second = 2 self = standardMsg = '5 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = > def test_fixture9(self): self.verify_fixture_file('astroid/module2.py') self = tests/test_mark_tokens.py:172: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:134: in verify_fixture_file tested_nodes = m.verify_all_nodes(self) m = path = 'astroid/module2.py' self = source = ('from data.module import YO, YOUPI\n' 'import data\n' '\n' '\n' 'class Specialization(YOUPI, YO):\n' ' pass\n' '\n' '\n' '\n' 'class Metaclass(type):\n' ' pass\n' '\n' '\n' '\n' 'class Interface:\n' ' pass\n' '\n' '\n' '\n' 'class MyIFace(Interface):\n' ' pass\n' '\n' '\n' '\n' 'class AnotherIFace(Interface):\n' ' pass\n' '\n' '\n' '\n' 'class MyException(Exception):\n' ' pass\n' '\n' '\n' '\n' 'class MyError(MyException):\n' ' pass\n' '\n' '\n' '\n' 'class AbstractClass(object):\n' ' \n' ' def to_override(self, whatever):\n' ' raise NotImplementedError()\n' ' \n' ' def return_something(self, param):\n' ' if param:\n' " return 'toto'\n" ' return\n' '\n' '\n' '\n' 'class Concrete0:\n' ' __implements__ = MyIFace\n' '\n' '\n' '\n' 'class Concrete1:\n' ' __implements__ = (MyIFace, AnotherIFace)\n' '\n' '\n' '\n' 'class Concrete2:\n' ' __implements__ = (MyIFace, AnotherIFace)\n' '\n' '\n' '\n' 'class Concrete23(Concrete1):\n' ' pass\n' '\n' 'del YO.member\n' 'del YO\n' '[SYN1, SYN2] = (Concrete0, Concrete1)\n' 'assert repr(1)\n' 'b = (1) | (((2) & (3)) ^ (8))\n' 'bb = ((1) | (two)) | (6)\n' 'ccc = ((one) & (two)) & (three)\n' 'dddd = ((x) ^ (o)) ^ (r)\n' "exec('c = 3')\n" "exec('c = 3', {}, {})\n" '\n' 'def raise_string(a=2, *args, **kwargs):\n' " raise Exception('yo')\n" " yield 'coucou'\n" ' yield\n' 'a = (b) + (2)\n' 'c = (b) * (2)\n' 'c = (b) / (2)\n' 'c = (b) // (2)\n' 'c = (b) - (2)\n' 'c = (b) % (2)\n' 'c = (b) ** (2)\n' 'c = (b) << (2)\n' 'c = (b) >> (2)\n' 'c = ~b\n' 'c = not b\n' 'd = [c]\n' 'e = d[:]\n' 'e = d[a:b:c]\n' 'raise_string(*args, **kwargs)\n' "print('bonjour', file=stream)\n" "print('salut', end=' ', file=stream)\n" '\n' 'def make_class(any, base=data.module.YO, *args, **kwargs):\n' ' """check base is correctly resolved to Concrete0"""\n' ' \n' ' \n' ' class Aaaa(base):\n' ' """dynamic class"""\n' ' \n' ' \n' ' return Aaaa\n' 'from os.path import abspath\n' 'import os as myos\n' '\n' '\n' 'class A:\n' ' pass\n' '\n' '\n' '\n' 'class A(A):\n' ' pass\n' '\n' '\n' 'def generator():\n' ' """A generator."""\n' ' yield\n' '\n' 'def not_a_generator():\n' ' """A function that contains generator, but is not one."""\n' ' \n' ' def generator():\n' ' yield\n' ' genl = lambda : (yield)\n' '\n' 'def with_metaclass(meta, *bases):\n' " return meta('NewBase', bases, {})\n" '\n' '\n' 'class NotMetaclass(with_metaclass(Metaclass)):\n' ' 
pass\n' '\n' '\n') tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Clas[88 chars] 5,\n 0,\n 5,\n 20],\n de[95 chars]()])" != "Clas[88 chars] 2,\n 0,\n 2,\n 20],\n de[95 chars]()])" E ClassDef( E name='Specialization', E doc=None, E is_dataclass=False, E position=[ E - 5, E ? ^ E + 2, E ? ^ E 0, E - 5, E ? ^ E + 2, E ? ^ E 20], E decorators=None, E bases=[Name(name='YOUPI'), Name(name='YO')], E keywords=[], E body=[Pass()]) node = rebuilt_node = self = test_case = tested_nodes = 3 text = 'class Specialization(YOUPI, YO):\n pass' __________________________ TestAstroid.test_fstrings ___________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 1 text = "def t():\n return f'{function(kwarg=24)}'" tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 't'), ('doc', None), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 't'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 't'), ('doc', None), ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=5))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 't'), ('doc', None), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=5))] vc1 = ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=5)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=5)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=5)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=5)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=5)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=5)) vc1 = Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=5) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=5) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=5) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=5) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=5) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=5) vc1 = 1 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 1, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 1 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 1 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 1 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 1 second = 2, msg = '1 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 1 != 2 first = 1 msg = '1 != 2' second = 2 self = standardMsg = '1 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_fstrings(self): for source in ( '(f"He said his name is {name!r}.",)', "f'{function(kwarg=24)}'", 'a = f"""result: {value:{width}.{precision}}"""', """[f"abc {a['x']} def"]""", "def t():\n return f'{function(kwarg=24)}'"): > self.create_mark_checker(source) self = source = "def t():\n return f'{function(kwarg=24)}'" tests/test_mark_tokens.py:306: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = "def t():\n return f'{function(kwarg=24)}'" verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[54 chars] 1,\n 0,\n 1,\n 5],\n dec[897 chars]))])" != "Func[54 chars] 2,\n 0,\n 2,\n 5],\n dec[897 chars]))])" E FunctionDef( E name='t', E doc=None, E position=[ E - 1, E ? ^ E + 2, E ? ^ E 0, E - 1, E ? ^ E + 2, E ? 
^ E 5], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Return(value=JoinedStr(values=[FormattedValue( E conversion=-1, E value=Call( E func=Name(name='function'), E args=[], E keywords=[Keyword( E arg='kwarg', E value=Const( E value=24, E kind=None))]), E format_spec=None)]))]) node = rebuilt_node = self = test_case = tested_nodes = 1 text = "def t():\n return f'{function(kwarg=24)}'" _____________________ TestAstroid.test_mark_tokens_simple ______________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 11 text = ('def global_access(key, val):\n' ' """function test"""\n' ' local = 1\n' ' MY_DICT[key] = val\n' ' for i in val:\n' ' if i:\n' ' del MY_DICT[i]\n' ' continue\n' ' else:\n' ' break\n' ' else:\n' ' return') tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'global_access'), ('doc', 'function test'), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'global_access'), ('doc', 'function test'), ...] 
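All of the TestAstroid failures in this log come down to the same field: the astroid 2.10.0 from the emerge history above attaches a position record to ClassDef/FunctionDef nodes, and its line number depends on where the node sits in the parsed source. The node taken from the full module keeps its real location (5 for the Specialization class above, and 1, 11, 7, 3, 27 in the other failures), while the node rebuilt from atok.get_text(node) is re-parsed as a tiny standalone module and always reports line 2, which suggests the test's parse_snippet helper re-parses the text with one extra leading line. A minimal sketch of the mismatch, assuming only dev-python/astroid-2.10.0 as installed here:

import astroid

source = (
    "from data.module import YO, YOUPI\n"
    "import data\n"
    "\n"
    "\n"
    "class Specialization(YOUPI, YO):\n"
    "    pass\n"
)

# In the full fixture module the ClassDef sits on line 5, so its position
# field is Position(lineno=5, col_offset=0, end_lineno=5, end_col_offset=20),
# exactly as shown in the failing diff above.
full = astroid.parse(source)
print(full.body[-1].position)

# Re-parsing only the extracted text, as verify_all_nodes() does with the
# result of atok.get_text(node), yields a ClassDef whose position starts at
# line 1 (line 2 in the diffs, via parse_snippet), so the recursive field
# comparison fails on the raw line numbers.
snippet = astroid.parse("class Specialization(YOUPI, YO):\n    pass")
print(snippet.body[0].position)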
def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , ]), ('name', 'global_access'), ('doc', 'function test'), ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, , ]), ('name', 'global_access'), ('doc', 'function test'), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17))] vc1 = ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17)) vc1 = Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=11, col_offset=0, end_lineno=11, end_col_offset=17) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=17) vc1 = 11 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = 11, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 11 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = first = 11, second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 11 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = first = 11, second = 2, msg = '11 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 11 != 2 first = 11 msg = '11 != 2' second = 2 self = standardMsg = '11 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_mark_tokens_simple(self): source = tools.read_fixture('astroid', 'module.py') > m = self.create_mark_checker(source) self = source = ('"""test module for astroid\n' '"""\n' '\n' "__revision__ = '$Id: module.py,v 1.2 2005-11-02 11:56:54 syt Exp $'\n" 'from astroid.node_classes import Name as NameNode\n' 'from astroid import modutils\n' 'from astroid.utils import *\n' 'import os.path\n' 'MY_DICT = {}\n' '\n' 'def global_access(key, val):\n' ' """function test"""\n' ' local = 1\n' ' MY_DICT[key] = val\n' ' for i in val:\n' ' if i:\n' ' del MY_DICT[i]\n' ' continue\n' ' else:\n' ' break\n' ' else:\n' ' return\n' '\n' '\n' 'class YO:\n' ' """hehe"""\n' ' a = 1\n' ' \n' ' def __init__(self):\n' ' try:\n' ' self.yo = 1\n' ' except ValueError as ex:\n' ' pass\n' ' except (NameError, TypeError):\n' ' raise XXXError()\n' ' except:\n' ' raise\n' '\n' '\n' '\n' 'class YOUPI(YO):\n' ' class_attr = None\n' ' \n' ' def __init__(self):\n' ' self.member = None\n' ' \n' ' def method(self):\n' ' """method test"""\n' ' global MY_DICT\n' ' try:\n' ' MY_DICT = {}\n' ' local = None\n' ' autre = [a for (a, b) in MY_DICT if b]\n' ' if b in autre:\n' ' return\n' ' else:\n' ' if a in autre:\n' " return 'hehe'\n" ' global_access(local, val=autre)\n' ' finally:\n' ' return local\n' ' \n' ' def static_method():\n' ' """static method test"""\n' " assert MY_DICT, '???'\n" ' static_method = staticmethod(static_method)\n' ' \n' ' def class_method(cls):\n' ' """class method test"""\n' ' exec(a, b)\n' ' class_method = classmethod(class_method)\n' '\n' '\n' 'def four_args(a, b, c, d):\n' ' """four arguments (was nested_args)"""\n' ' while 1:\n' ' if a:\n' ' break\n' ' 
a += +1\n' ' else:\n' ' b += -2\n' ' if c:\n' ' d = ((a) and (b)) or (c)\n' ' else:\n' ' c = ((a) and (b)) or (d)\n' ' list(map(lambda x, y: (y, x), a))\n' 'redirect = four_args\n' '\n') tests/test_mark_tokens.py:73: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = ('"""test module for astroid\n' '"""\n' '\n' "__revision__ = '$Id: module.py,v 1.2 2005-11-02 11:56:54 syt Exp $'\n" 'from astroid.node_classes import Name as NameNode\n' 'from astroid import modutils\n' 'from astroid.utils import *\n' 'import os.path\n' 'MY_DICT = {}\n' '\n' 'def global_access(key, val):\n' ' """function test"""\n' ' local = 1\n' ' MY_DICT[key] = val\n' ' for i in val:\n' ' if i:\n' ' del MY_DICT[i]\n' ' continue\n' ' else:\n' ' break\n' ' else:\n' ' return\n' '\n' '\n' 'class YO:\n' ' """hehe"""\n' ' a = 1\n' ' \n' ' def __init__(self):\n' ' try:\n' ' self.yo = 1\n' ' except ValueError as ex:\n' ' pass\n' ' except (NameError, TypeError):\n' ' raise XXXError()\n' ' except:\n' ' raise\n' '\n' '\n' '\n' 'class YOUPI(YO):\n' ' class_attr = None\n' ' \n' ' def __init__(self):\n' ' self.member = None\n' ' \n' ' def method(self):\n' ' """method test"""\n' ' global MY_DICT\n' ' try:\n' ' MY_DICT = {}\n' ' local = None\n' ' autre = [a for (a, b) in MY_DICT if b]\n' ' if b in autre:\n' ' return\n' ' else:\n' ' if a in autre:\n' " return 'hehe'\n" ' global_access(local, val=autre)\n' ' finally:\n' ' return local\n' ' \n' ' def static_method():\n' ' """static method test"""\n' " assert MY_DICT, '???'\n" ' static_method = staticmethod(static_method)\n' ' \n' ' def class_method(cls):\n' ' """class method test"""\n' ' exec(a, b)\n' ' class_method = classmethod(class_method)\n' '\n' '\n' 'def four_args(a, b, c, d):\n' ' """four arguments (was nested_args)"""\n' ' while 1:\n' ' if a:\n' ' break\n' ' a += +1\n' ' else:\n' ' b += -2\n' ' if c:\n' ' d = ((a) and (b)) or (c)\n' ' else:\n' ' c = ((a) and (b)) or (d)\n' ' list(map(lambda x, y: (y, x), a))\n' 'redirect = four_args\n' '\n') verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[77 chars] 11,\n 0,\n 11,\n 17],\n [1294 chars]])])" != "Func[77 chars] 2,\n 0,\n 2,\n 17],\n de[1292 chars]])])" E FunctionDef( E name='global_access', E doc='function test', E position=[ E - 11, E ? ^^ E + 2, E ? ^ E 0, E - 11, E ? ^^ E + 2, E ? 
^ E 17], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='key'), Name(name='val')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None, None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None, None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[ E Assign( E targets=[Name(name='local')], E value=Const( E value=1, E kind=None)), E Assign( E targets=[Subscript( E ctx=, E value=Name(name='MY_DICT'), E slice=Name(name='key'))], E value=Name(name='val')), E For( E target=Name(name='i'), E iter=Name(name='val'), E body=[If( E test=Name(name='i'), E body=[Delete(targets=[Subscript( E ctx=, E value=Name(name='MY_DICT'), E slice=Name(name='i'))]), E Continue()], E orelse=[Break()])], E orelse=[Return(value=None)])]) node = rebuilt_node = self = test_case = tested_nodes = 11 text = ('def global_access(key, val):\n' ' """function test"""\n' ' local = 1\n' ' MY_DICT[key] = val\n' ' for i in val:\n' ' if i:\n' ' del MY_DICT[i]\n' ' continue\n' ' else:\n' ' break\n' ' else:\n' ' return') _______________________ TestAstroid.test_print_function ________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 5 text = (' def enumerate(iterable):\n' ' """emulates the python2.3 enumerate() function"""\n' ' i = 0\n' ' for val in iterable:\n' ' yield i, val\n' ' i += 1') tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'enumerate'), ('doc', 'emulates the python2.3 enumerate() function'), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'enumerate'), ('doc', 'emulates the python2.3 enumerate() function'), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'enumerate'), ('doc', 'emulates the python2.3 enumerate() function'), ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=17))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', 'enumerate'), ('doc', 'emulates the python2.3 enumerate() function'), ('position', Position(lineno=2, col_offset=4, end_lineno=2, end_col_offset=17))] vc1 = ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=17)) vc2 = ('position', Position(lineno=2, col_offset=4, end_lineno=2, end_col_offset=17)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=17)) t2 = ('position', Position(lineno=2, col_offset=4, end_lineno=2, end_col_offset=17)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=17)) t2 = ('position', Position(lineno=2, col_offset=4, end_lineno=2, end_col_offset=17)) vc1 = Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=17) vc2 = Position(lineno=2, col_offset=4, end_lineno=2, end_col_offset=17) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=17) t2 = Position(lineno=2, col_offset=4, end_lineno=2, end_col_offset=17) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=7, col_offset=4, end_lineno=7, end_col_offset=17) t2 = Position(lineno=2, col_offset=4, end_lineno=2, end_col_offset=17) vc1 = 7 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 7 t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 7 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = first = 7, second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. 
""" assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 7 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = first = 7, second = 2, msg = '7 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 7 != 2 first = 7 msg = '7 != 2' second = 2 self = standardMsg = '7 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_print_function(self): # This testcase imports print as function (using from __future__). Check that we can parse it. # verify_all_nodes doesn't work on Python 2 because the print() call parsed in isolation # is viewed as a Print node since it doesn't see the future import source = tools.read_fixture('astroid/nonregr.py') > m = self.create_mark_checker(source, verify=six.PY3) self = source = ('\n' '\n' 'try:\n' ' enumerate = enumerate\n' 'except NameError:\n' '\n' ' def enumerate(iterable):\n' ' """emulates the python2.3 enumerate() function"""\n' ' i = 0\n' ' for val in iterable:\n' ' yield i, val\n' ' i += 1\n' '\n' 'def toto(value):\n' ' for k, v in value:\n' " print(v.get('yo'))\n" '\n' '\n' 'import imp\n' "fp, mpath, desc = imp.find_module('optparse',a)\n" "s_opt = imp.load_module('std_optparse', fp, mpath, desc)\n" '\n' 'class OptionParser(s_opt.OptionParser):\n' '\n' ' def parse_args(self, args=None, values=None, real_optparse=False):\n' ' if real_optparse:\n' ' pass\n' '## return super(OptionParser, self).parse_args()\n' ' else:\n' ' import optcomp\n' ' optcomp.completion(self)\n' '\n' '\n' 'class Aaa(object):\n' ' """docstring"""\n' ' def __init__(self):\n' " self.__setattr__('a','b')\n" ' pass\n' '\n' ' def one_public(self):\n' ' """docstring"""\n' ' pass\n' '\n' ' def another_public(self):\n' ' """docstring"""\n' ' pass\n' '\n' 'class Ccc(Aaa):\n' ' """docstring"""\n' '\n' ' class Ddd(Aaa):\n' ' """docstring"""\n' ' pass\n' '\n' ' class Eee(Ddd):\n' ' """docstring"""\n' ' pass\n') tests/test_mark_tokens.py:269: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = ('\n' '\n' 'try:\n' ' enumerate = enumerate\n' 'except NameError:\n' '\n' ' def enumerate(iterable):\n' ' """emulates the python2.3 enumerate() function"""\n' ' i = 0\n' ' for val in iterable:\n' ' yield i, val\n' ' i += 1\n' '\n' 'def toto(value):\n' ' for k, v in value:\n' " print(v.get('yo'))\n" '\n' '\n' 'import imp\n' "fp, mpath, desc = imp.find_module('optparse',a)\n" "s_opt = imp.load_module('std_optparse', fp, mpath, desc)\n" '\n' 'class OptionParser(s_opt.OptionParser):\n' '\n' ' def parse_args(self, args=None, values=None, real_optparse=False):\n' ' if real_optparse:\n' ' pass\n' '## return super(OptionParser, self).parse_args()\n' ' else:\n' ' import optcomp\n' ' optcomp.completion(self)\n' '\n' '\n' 'class Aaa(object):\n' ' """docstring"""\n' ' def __init__(self):\n' " self.__setattr__('a','b')\n" ' pass\n' '\n' ' def one_public(self):\n' ' """docstring"""\n' ' pass\n' '\n' ' def another_public(self):\n' ' """docstring"""\n' ' 
pass\n' '\n' 'class Ccc(Aaa):\n' ' """docstring"""\n' '\n' ' class Ddd(Aaa):\n' ' """docstring"""\n' ' pass\n' '\n' ' class Eee(Ddd):\n' ' """docstring"""\n' ' pass\n') verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[103 chars] 7,\n 4,\n 7,\n 17],\n de[1040 chars]])])" != "Func[103 chars] 2,\n 4,\n 2,\n 17],\n de[1040 chars]])])" E FunctionDef( E name='enumerate', E doc='emulates the python2.3 enumerate() function', E position=[ E - 7, E ? ^ E + 2, E ? ^ E 4, E - 7, E ? ^ E + 2, E ? ^ E 17], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='iterable')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Assign( E targets=[Name(name='i')], E value=Const( E value=0, E kind=None)), E For( E target=Name(name='val'), E iter=Name(name='iterable'), E body=[Expr(value=Yield(value=Tuple( E ctx=, E elts=[Name(name='i'), Name(name='val')]))), E AugAssign( E op='+=', E target=Name(name='i'), E value=Const( E value=1, E kind=None))], E orelse=[])]) node = rebuilt_node = self = test_case = tested_nodes = 5 text = (' def enumerate(iterable):\n' ' """emulates the python2.3 enumerate() function"""\n' ' i = 0\n' ' for val in iterable:\n' ' yield i, val\n' ' i += 1') ____________________________ TestAstroid.test_splat ____________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 9 text = 'def print_all(a, b, c, d, e):\n print(a, b, c, d ,e)' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'print_all'), ('doc', None), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'print_all'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'print_all'), ('doc', None), ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'print_all'), ('doc', None), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=13))] vc1 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=13)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=13)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=13)) vc1 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=13) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=13) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
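The innermost frames show how the comparison ends up on bare integers (vc1 = 3 and vc2 = 2 just above, and likewise 5, 1, 11 and 7 in the earlier failures): astroid's Position value is a tuple subclass (a NamedTuple), so assert_nodes_equal() takes its isinstance(t1, (list, tuple)) branch and recurses into the elements until it is comparing line numbers directly. A stripped-down re-implementation of that descent, written here only to illustrate the mechanism (Position below is a stand-in, not imported from astroid):

from typing import NamedTuple

class Position(NamedTuple):
    # Stand-in for astroid's Position record seen in the locals above.
    lineno: int
    col_offset: int
    end_lineno: int
    end_col_offset: int

def nodes_equal(t1, t2):
    # Same shape as the tuple branch of assert_nodes_equal(): descend into
    # lists/tuples element by element, compare everything else with ==.
    if isinstance(t1, (list, tuple)):
        return (type(t1) is type(t2)
                and len(t1) == len(t2)
                and all(nodes_equal(a, b) for a, b in zip(t1, t2)))
    return t1 == t2

orig = ('position', Position(3, 0, 3, 13))      # field of the original FunctionDef
rebuilt = ('position', Position(2, 0, 2, 13))   # field of the re-parsed snippet
print(nodes_equal(orig, rebuilt))               # False: fails on 3 != 2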
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=3, col_offset=0, end_lineno=3, end_col_offset=13) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=13) vc1 = 3 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 3, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 3 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 3 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. 
""" assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 3 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 3 second = 2, msg = '3 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 3 != 2 first = 3 msg = '3 != 2' second = 2 self = standardMsg = '3 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_splat(self): # See https://bitbucket.org/plas/thonny/issues/151/debugger-crashes-when-encountering-a-splat source = textwrap.dedent(""" arr = [1,2,3,4,5] def print_all(a, b, c, d, e): print(a, b, c, d ,e) print_all(*arr) """) > m = self.create_mark_checker(source) self = source = ('\n' 'arr = [1,2,3,4,5]\n' 'def print_all(a, b, c, d, e):\n' ' print(a, b, c, d ,e)\n' 'print_all(*arr)\n') tests/test_mark_tokens.py:334: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = ('\n' 'arr = [1,2,3,4,5]\n' 'def print_all(a, b, c, d, e):\n' ' print(a, b, c, d ,e)\n' 'print_all(*arr)\n') verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[62 chars] 3,\n 0,\n 3,\n 13],\n de[1005 chars]))])" != "Func[62 chars] 2,\n 0,\n 2,\n 13],\n de[1005 chars]))])" E FunctionDef( E name='print_all', E doc=None, E position=[ E - 3, E ? ^ E + 2, E ? ^ E 0, E - 3, E ? ^ E + 2, E ? ^ E 13], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[ E Name(name='a'), E Name(name='b'), E Name(name='c'), E Name(name='d'), E Name(name='e')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[ E None, E None, E None, E None, E None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[ E None, E None, E None, E None, E None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[Expr(value=Call( E func=Name(name='print'), E args=[ E Name(name='a'), E Name(name='b'), E Name(name='c'), E Name(name='d'), E Name(name='e')], E keywords=[]))]) node = rebuilt_node = self = test_case = tested_nodes = 9 text = 'def print_all(a, b, c, d, e):\n print(a, b, c, d ,e)' _________________________ TestAstroid.test_sys_modules _________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. 
""" test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 4 text = ('def _wrap(new, old):\n' ' """Simple substitute for functools.update_wrapper."""\n' " for replace in ['__module__', '__name__', '__qualname__', '__doc__']:\n" ' if hasattr(old, replace):\n' ' setattr(new, replace, getattr(old, replace))\n' ' new.__dict__.update(old.__dict__)') tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', '_wrap'), ('doc', 'Simple substitute for functools.update_wrapper.'), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', '_wrap'), ('doc', 'Simple substitute for functools.update_wrapper.'), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', '_wrap'), ('doc', 'Simple substitute for functools.update_wrapper.'), ('position', Position(lineno=27, col_offset=0, end_lineno=27, end_col_offset=9))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', [, ]), ('name', '_wrap'), ('doc', 'Simple substitute for functools.update_wrapper.'), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9))] vc1 = ('position', Position(lineno=27, col_offset=0, end_lineno=27, end_col_offset=9)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=27, col_offset=0, end_lineno=27, end_col_offset=9)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=27, col_offset=0, end_lineno=27, end_col_offset=9)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9)) vc1 = Position(lineno=27, col_offset=0, end_lineno=27, end_col_offset=9) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=27, col_offset=0, end_lineno=27, end_col_offset=9) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=27, col_offset=0, end_lineno=27, end_col_offset=9) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=9) vc1 = 27 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 27 t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 27 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 27 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 27 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 27 second = 2, msg = '27 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 27 != 2 first = 27 msg = '27 != 2' second = 2 self = standardMsg = '27 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_sys_modules(self): """ Verify all nodes on source files obtained from sys.modules. This can take a long time as there are many modules, so it only tests all modules if the environment variable ASTTOKENS_SLOW_TESTS has been set. """ modules = list(sys.modules.values()) if not os.environ.get('ASTTOKENS_SLOW_TESTS'): modules = modules[:20] start = time() for module in modules: # Don't let this test (which runs twice) take longer than 13 minutes # to avoid the travis build time limit of 30 minutes if time() - start > 13 * 60: break try: filename = inspect.getsourcefile(module) except TypeError: continue if not filename: continue filename = os.path.abspath(filename) print(filename) try: with io.open(filename) as f: source = f.read() except OSError: continue # Astroid fails with a syntax error if a type comment is on its own line if self.is_astroid_test and re.search(r'^\s*# type: ', source, re.MULTILINE): print('Skipping', filename) continue > self.create_mark_checker(source) f = <_io.TextIOWrapper name='/usr/lib/python3.8/importlib/_bootstrap.py' mode='r' encoding='UTF-8'> filename = '/usr/lib/python3.8/importlib/_bootstrap.py' module = modules = [, , , , , , , , , , , , , , , , , , , ] self = source = ('"""Core implementation of import.\n' '\n' 'This module is NOT meant to be directly imported! It has been designed such\n' 'that it can be bootstrapped into Python as the implementation of import. As\n' 'such it requires the injection of specific modules and attributes in order ' 'to\n' 'work. 
One should use importlib as the public-facing version of this module.\n' '\n' '"""\n' '#\n' '# IMPORTANT: Whenever making changes to this module, be sure to run a ' 'top-level\n' '# `make regen-importlib` followed by `make` in order to get the frozen ' 'version\n' '# of the module updated. Not doing so will result in the Makefile to fail ' 'for\n' "# all others who don't have a ./python around to freeze the module\n" '# in the early stages of compilation.\n' '#\n' '\n' '# See importlib._setup() for what is injected into the global namespace.\n' '\n' '# When editing this code be aware that code executed at import time CANNOT\n' '# reference any injected objects! This includes not only global code but ' 'also\n' '# anything specified at the class level.\n' '\n' '# Bootstrap-related code ' '######################################################\n' '\n' '_bootstrap_external = None\n' '\n' 'def _wrap(new, old):\n' ' """Simple substitute for functools.update_wrapper."""\n' " for replace in ['__module__', '__name__', '__qualname__', '__doc__']:\n" ' if hasattr(old, replace):\n' ' setattr(new, replace, getattr(old, replace))\n' ' new.__dict__.update(old.__dict__)\n' '\n' '\n' 'def _new_module(name):\n' ' return type(sys)(name)\n' '\n' '\n' '# Module-level locking ' '########################################################\n' '\n' '# A dict mapping module names to weakrefs of _ModuleLock instances\n' '# Dictionary protected by the global import lock\n' '_module_locks = {}\n' '# A dict mapping thread ids to _ModuleLock instances\n' '_blocking_on = {}\n' '\n' '\n' 'class _DeadlockError(RuntimeError):\n' ' pass\n' '\n' '\n' 'class _ModuleLock:\n' ' """A recursive lock implementation which is able to detect deadlocks\n' ' (e.g. thread 1 trying to take locks A then B, and thread 2 trying to\n' ' take locks B then A).\n' ' """\n' '\n' ' def __init__(self, name):\n' ' self.lock = _thread.allocate_lock()\n' ' self.wakeup = _thread.allocate_lock()\n' ' self.name = name\n' ' self.owner = None\n' ' self.count = 0\n' ' self.waiters = 0\n' '\n' ' def has_deadlock(self):\n' ' # Deadlock avoidance for concurrent circular imports.\n' ' me = _thread.get_ident()\n' ' tid = self.owner\n' ' while True:\n' ' lock = _blocking_on.get(tid)\n' ' if lock is None:\n' ' return False\n' ' tid = lock.owner\n' ' if tid == me:\n' ' return True\n' '\n' ' def acquire(self):\n' ' """\n' ' Acquire the module lock. 
If a potential deadlock is detected,\n' ' a _DeadlockError is raised.\n' ' Otherwise, the lock is always acquired and True is returned.\n' ' """\n' ' tid = _thread.get_ident()\n' ' _blocking_on[tid] = self\n' ' try:\n' ' while True:\n' ' with self.lock:\n' ' if self.count == 0 or self.owner == tid:\n' ' self.owner = tid\n' ' self.count += 1\n' ' return True\n' ' if self.has_deadlock():\n' " raise _DeadlockError('deadlock detected by %r' % " 'self)\n' ' if self.wakeup.acquire(False):\n' ' self.waiters += 1\n' ' # Wait for a release() call\n' ' self.wakeup.acquire()\n' ' self.wakeup.release()\n' ' finally:\n' ' del _blocking_on[tid]\n' '\n' ' def release(self):\n' ' tid = _thread.get_ident()\n' ' with self.lock:\n' ' if self.owner != tid:\n' " raise RuntimeError('cannot release un-acquired lock')\n" ' assert self.count > 0\n' ' self.count -= 1\n' ' if self.count == 0:\n' ' self.owner = None\n' ' if self.waiters:\n' ' self.waiters -= 1\n' ' self.wakeup.release()\n' '\n' ' def __repr__(self):\n' " return '_ModuleLock({!r}) at {}'.format(self.name, id(self))\n" '\n' '\n' 'class _DummyModuleLock:\n' ' """A simple _ModuleLock equivalent for Python builds without\n' ' multi-threading support."""\n' '\n' ' def __init__(self, name):\n' ' self.name = name\n' ' self.count = 0\n' '\n' ' def acquire(self):\n' ' self.count += 1\n' ' return True\n' '\n' ' def release(self):\n' ' if self.count == 0:\n' " raise RuntimeError('cannot release un-acquired lock')\n" ' self.count -= 1\n' '\n' ' def __repr__(self):\n' " return '_DummyModuleLock({!r}) at {}'.format(self.name, id(self))\n" '\n' '\n' 'class _ModuleLockManager:\n' '\n' ' def __init__(self, name):\n' ' self._name = name\n' ' self._lock = None\n' '\n' ' def __enter__(self):\n' ' self._lock = _get_module_lock(self._name)\n' ' self._lock.acquire()\n' '\n' ' def __exit__(self, *args, **kwargs):\n' ' self._lock.release()\n' '\n' '\n' '# The following two functions are for consumption by Python/import.c.\n' '\n' 'def _get_module_lock(name):\n' ' """Get or create the module lock for a given module name.\n' '\n' ' Acquire/release internally the global import lock to protect\n' ' _module_locks."""\n' '\n' ' _imp.acquire_lock()\n' ' try:\n' ' try:\n' ' lock = _module_locks[name]()\n' ' except KeyError:\n' ' lock = None\n' '\n' ' if lock is None:\n' ' if _thread is None:\n' ' lock = _DummyModuleLock(name)\n' ' else:\n' ' lock = _ModuleLock(name)\n' '\n' ' def cb(ref, name=name):\n' ' _imp.acquire_lock()\n' ' try:\n' ' # bpo-31070: Check if another thread created a new lock\n' ' # after the previous lock was destroyed\n' ' # but before the weakref callback was called.\n' ' if _module_locks.get(name) is ref:\n' ' del _module_locks[name]\n' ' finally:\n' ' _imp.release_lock()\n' '\n' ' _module_locks[name] = _weakref.ref(lock, cb)\n' ' finally:\n' ' _imp.release_lock()\n' '\n' ' return lock\n' '\n' '\n' 'def _lock_unlock_module(name):\n' ' """Acquires then releases the module lock for a given module name.\n' '\n' ' This is used to ensure a module is completely initialized, in the\n' ' event it is being imported by another thread.\n' ' """\n' ' lock = _get_module_lock(name)\n' ' try:\n' ' lock.acquire()\n' ' except _DeadlockError:\n' " # Concurrent circular import, we'll accept a partially initialized\n" ' # module object.\n' ' pass\n' ' else:\n' ' lock.release()\n' '\n' '# Frame stripping magic ###############################################\n' 'def _call_with_frames_removed(f, *args, **kwds):\n' ' """remove_importlib_frames in import.c will always remove 
sequences\n' ' of importlib frames that end with a call to this function\n' '\n' ' Use it instead of a normal call in places where including the importlib\n' ' frames introduces unwanted noise into the traceback (e.g. when ' 'executing\n' ' module code)\n' ' """\n' ' return f(*args, **kwds)\n' '\n' '\n' 'def _verbose_message(message, *args, verbosity=1):\n' ' """Print the message to stderr if -v/PYTHONVERBOSE is turned on."""\n' ' if sys.flags.verbose >= verbosity:\n' " if not message.startswith(('#', 'import ')):\n" " message = '# ' + message\n" ' print(message.format(*args), file=sys.stderr)\n' '\n' '\n' 'def _requires_builtin(fxn):\n' ' """Decorator to verify the named module is built-in."""\n' ' def _requires_builtin_wrapper(self, fullname):\n' ' if fullname not in sys.builtin_module_names:\n' " raise ImportError('{!r} is not a built-in " "module'.format(fullname),\n" ' name=fullname)\n' ' return fxn(self, fullname)\n' ' _wrap(_requires_builtin_wrapper, fxn)\n' ' return _requires_builtin_wrapper\n' '\n' '\n' 'def _requires_frozen(fxn):\n' ' """Decorator to verify the named module is frozen."""\n' ' def _requires_frozen_wrapper(self, fullname):\n' ' if not _imp.is_frozen(fullname):\n' " raise ImportError('{!r} is not a frozen " "module'.format(fullname),\n" ' name=fullname)\n' ' return fxn(self, fullname)\n' ' _wrap(_requires_frozen_wrapper, fxn)\n' ' return _requires_frozen_wrapper\n' '\n' '\n' '# Typically used by loader classes as a method replacement.\n' 'def _load_module_shim(self, fullname):\n' ' """Load the specified module into sys.modules and return it.\n' '\n' ' This method is deprecated. Use loader.exec_module instead.\n' '\n' ' """\n' ' spec = spec_from_loader(fullname, self)\n' ' if fullname in sys.modules:\n' ' module = sys.modules[fullname]\n' ' _exec(spec, module)\n' ' return sys.modules[fullname]\n' ' else:\n' ' return _load(spec)\n' '\n' '# Module specifications ' '#######################################################\n' '\n' 'def _module_repr(module):\n' ' # The implementation of ModuleType.__repr__().\n' " loader = getattr(module, '__loader__', None)\n" " if hasattr(loader, 'module_repr'):\n" ' # As soon as BuiltinImporter, FrozenImporter, and NamespaceLoader\n' ' # drop their implementations for module_repr. we can add a\n' ' # deprecation warning here.\n' ' try:\n' ' return loader.module_repr(module)\n' ' except Exception:\n' ' pass\n' ' try:\n' ' spec = module.__spec__\n' ' except AttributeError:\n' ' pass\n' ' else:\n' ' if spec is not None:\n' ' return _module_repr_from_spec(spec)\n' '\n' " # We could use module.__class__.__name__ instead of 'module' in the\n" ' # various repr permutations.\n' ' try:\n' ' name = module.__name__\n' ' except AttributeError:\n' " name = '?'\n" ' try:\n' ' filename = module.__file__\n' ' except AttributeError:\n' ' if loader is None:\n' " return ''.format(name)\n" ' else:\n' " return ''.format(name, loader)\n" ' else:\n' " return ''.format(name, filename)\n" '\n' '\n' 'class ModuleSpec:\n' ' """The specification for a module, used for loading.\n' '\n' " A module's spec is the source for information about the module. For\n" " data associated with the module, including source, use the spec's\n" ' loader.\n' '\n' ' `name` is the absolute name of the module. `loader` is the loader\n' ' to use when loading the module. `parent` is the name of the\n' ' package the module is in. The parent is derived from the name.\n' '\n' ' `is_package` determines if the module is considered a package or\n' ' not. 
On modules this is reflected by the `__path__` attribute.\n' '\n' ' `origin` is the specific location used by the loader from which to\n' ' load the module, if that information is available. When filename is\n' ' set, origin will match.\n' '\n' ' `has_location` indicates that a spec\'s "origin" reflects a location.\n' ' When this is True, `__file__` attribute of the module is set.\n' '\n' ' `cached` is the location of the cached bytecode file, if any. It\n' ' corresponds to the `__cached__` attribute.\n' '\n' ' `submodule_search_locations` is the sequence of path entries to\n' ' search when importing submodules. If set, is_package should be\n' ' True--and False otherwise.\n' '\n' ' Packages are simply modules that (may) have submodules. If a spec\n' ' has a non-None value in `submodule_search_locations`, the import\n' ' system will consider modules loaded from the spec as packages.\n' '\n' ' Only finders (see importlib.abc.MetaPathFinder and\n' ' importlib.abc.PathEntryFinder) should modify ModuleSpec instances.\n' '\n' ' """\n' '\n' ' def __init__(self, name, loader, *, origin=None, loader_state=None,\n' ' is_package=None):\n' ' self.name = name\n' ' self.loader = loader\n' ' self.origin = origin\n' ' self.loader_state = loader_state\n' ' self.submodule_search_locations = [] if is_package else None\n' '\n' ' # file-location attributes\n' ' self._set_fileattr = False\n' ' self._cached = None\n' '\n' ' def __repr__(self):\n' " args = ['name={!r}'.format(self.name),\n" " 'loader={!r}'.format(self.loader)]\n" ' if self.origin is not None:\n' " args.append('origin={!r}'.format(self.origin))\n" ' if self.submodule_search_locations is not None:\n' " args.append('submodule_search_locations={}'\n" ' .format(self.submodule_search_locations))\n' " return '{}({})'.format(self.__class__.__name__, ', '.join(args))\n" '\n' ' def __eq__(self, other):\n' ' smsl = self.submodule_search_locations\n' ' try:\n' ' return (self.name == other.name and\n' ' self.loader == other.loader and\n' ' self.origin == other.origin and\n' ' smsl == other.submodule_search_locations and\n' ' self.cached == other.cached and\n' ' self.has_location == other.has_location)\n' ' except AttributeError:\n' ' return False\n' '\n' ' @property\n' ' def cached(self):\n' ' if self._cached is None:\n' ' if self.origin is not None and self._set_fileattr:\n' ' if _bootstrap_external is None:\n' ' raise NotImplementedError\n' ' self._cached = _bootstrap_external._get_cached(self.origin)\n' ' return self._cached\n' '\n' ' @cached.setter\n' ' def cached(self, cached):\n' ' self._cached = cached\n' '\n' ' @property\n' ' def parent(self):\n' ' """The name of the module\'s parent."""\n' ' if self.submodule_search_locations is None:\n' " return self.name.rpartition('.')[0]\n" ' else:\n' ' return self.name\n' '\n' ' @property\n' ' def has_location(self):\n' ' return self._set_fileattr\n' '\n' ' @has_location.setter\n' ' def has_location(self, value):\n' ' self._set_fileattr = bool(value)\n' '\n' '\n' 'def spec_from_loader(name, loader, *, origin=None, is_package=None):\n' ' """Return a module spec based on various loader methods."""\n' " if hasattr(loader, 'get_filename'):\n" ' if _bootstrap_external is None:\n' ' raise NotImplementedError\n' ' spec_from_file_location = ' '_bootstrap_external.spec_from_file_location\n' '\n' ' if is_package is None:\n' ' return spec_from_file_location(name, loader=loader)\n' ' search = [] if is_package else None\n' ' return spec_from_file_location(name, loader=loader,\n' ' submodule_search_locations=search)\n' '\n' ' 
if is_package is None:\n' " if hasattr(loader, 'is_package'):\n" ' try:\n' ' is_package = loader.is_package(name)\n' ' except ImportError:\n' ' is_package = None # aka, undefined\n' ' else:\n' ' # the default\n' ' is_package = False\n' '\n' ' return ModuleSpec(name, loader, origin=origin, is_package=is_package)\n' '\n' '\n' 'def _spec_from_module(module, loader=None, origin=None):\n' ' # This function is meant for use in _setup().\n' ' try:\n' ' spec = module.__spec__\n' ' except AttributeError:\n' ' pass\n' ' else:\n' ' if spec is not None:\n' ' return spec\n' '\n' ' name = module.__name__\n' ' if loader is None:\n' ' try:\n' ' loader = module.__loader__\n' ' except AttributeError:\n' ' # loader will stay None.\n' ' pass\n' ' try:\n' ' location = module.__file__\n' ' except AttributeError:\n' ' location = None\n' ' if origin is None:\n' ' if location is None:\n' ' try:\n' ' origin = loader._ORIGIN\n' ' except AttributeError:\n' ' origin = None\n' ' else:\n' ' origin = location\n' ' try:\n' ' cached = module.__cached__\n' ' except AttributeError:\n' ' cached = None\n' ' try:\n' ' submodule_search_locations = list(module.__path__)\n' ' except AttributeError:\n' ' submodule_search_locations = None\n' '\n' ' spec = ModuleSpec(name, loader, origin=origin)\n' ' spec._set_fileattr = False if location is None else True\n' ' spec.cached = cached\n' ' spec.submodule_search_locations = submodule_search_locations\n' ' return spec\n' '\n' '\n' 'def _init_module_attrs(spec, module, *, override=False):\n' ' # The passed-in module may be not support attribute assignment,\n' " # in which case we simply don't set the attributes.\n" ' # __name__\n' " if (override or getattr(module, '__name__', None) is None):\n" ' try:\n' ' module.__name__ = spec.name\n' ' except AttributeError:\n' ' pass\n' ' # __loader__\n' " if override or getattr(module, '__loader__', None) is None:\n" ' loader = spec.loader\n' ' if loader is None:\n' ' # A backward compatibility hack.\n' ' if spec.submodule_search_locations is not None:\n' ' if _bootstrap_external is None:\n' ' raise NotImplementedError\n' ' _NamespaceLoader = _bootstrap_external._NamespaceLoader\n' '\n' ' loader = _NamespaceLoader.__new__(_NamespaceLoader)\n' ' loader._path = spec.submodule_search_locations\n' ' spec.loader = loader\n' ' # While the docs say that module.__file__ is not set for\n' ' # built-in modules, and the code below will avoid setting it ' 'if\n' ' # spec.has_location is false, this is incorrect for ' 'namespace\n' ' # packages. Namespace packages have no location, but their\n' ' # __spec__.origin is None, and thus their module.__file__\n' ' # should also be None for consistency. 
While a bit of a ' 'hack,\n' ' # this is the best place to ensure this consistency.\n' ' #\n' ' # See # ' 'https://docs.python.org/3/library/importlib.html#importlib.abc.Loader.load_module\n' ' # and bpo-32305\n' ' module.__file__ = None\n' ' try:\n' ' module.__loader__ = loader\n' ' except AttributeError:\n' ' pass\n' ' # __package__\n' " if override or getattr(module, '__package__', None) is None:\n" ' try:\n' ' module.__package__ = spec.parent\n' ' except AttributeError:\n' ' pass\n' ' # __spec__\n' ' try:\n' ' module.__spec__ = spec\n' ' except AttributeError:\n' ' pass\n' ' # __path__\n' " if override or getattr(module, '__path__', None) is None:\n" ' if spec.submodule_search_locations is not None:\n' ' try:\n' ' module.__path__ = spec.submodule_search_locations\n' ' except AttributeError:\n' ' pass\n' ' # __file__/__cached__\n' ' if spec.has_location:\n' " if override or getattr(module, '__file__', None) is None:\n" ' try:\n' ' module.__file__ = spec.origin\n' ' except AttributeError:\n' ' pass\n' '\n' " if override or getattr(module, '__cached__', None) is None:\n" ' if spec.cached is not None:\n' ' try:\n' ' module.__cached__ = spec.cached\n' ' except AttributeError:\n' ' pass\n' ' return module\n' '\n' '\n' 'def module_from_spec(spec):\n' ' """Create a module based on the provided spec."""\n' ' # Typically loaders will not implement create_module().\n' ' module = None\n' " if hasattr(spec.loader, 'create_module'):\n" ' # If create_module() returns `None` then it means default\n' ' # module creation should be used.\n' ' module = spec.loader.create_module(spec)\n' " elif hasattr(spec.loader, 'exec_module'):\n" " raise ImportError('loaders that define exec_module() '\n" " 'must also define create_module()')\n" ' if module is None:\n' ' module = _new_module(spec.name)\n' ' _init_module_attrs(spec, module)\n' ' return module\n' '\n' '\n' 'def _module_repr_from_spec(spec):\n' ' """Return the repr to use for the module."""\n' ' # We mostly replicate _module_repr() using the spec attributes.\n' " name = '?' 
if spec.name is None else spec.name\n" ' if spec.origin is None:\n' ' if spec.loader is None:\n' " return ''.format(name)\n" ' else:\n' " return ''.format(name, spec.loader)\n" ' else:\n' ' if spec.has_location:\n' " return ''.format(name, spec.origin)\n" ' else:\n' " return ''.format(spec.name, spec.origin)\n" '\n' '\n' '# Used by importlib.reload() and _load_module_shim().\n' 'def _exec(spec, module):\n' ' """Execute the spec\'s specified module in an existing module\'s ' 'namespace."""\n' ' name = spec.name\n' ' with _ModuleLockManager(name):\n' ' if sys.modules.get(name) is not module:\n' " msg = 'module {!r} not in sys.modules'.format(name)\n" ' raise ImportError(msg, name=name)\n' ' try:\n' ' if spec.loader is None:\n' ' if spec.submodule_search_locations is None:\n' " raise ImportError('missing loader', name=spec.name)\n" ' # Namespace package.\n' ' _init_module_attrs(spec, module, override=True)\n' ' else:\n' ' _init_module_attrs(spec, module, override=True)\n' " if not hasattr(spec.loader, 'exec_module'):\n" ' # (issue19713) Once BuiltinImporter and ' 'ExtensionFileLoader\n' ' # have exec_module() implemented, we can add a ' 'deprecation\n' ' # warning here.\n' ' spec.loader.load_module(name)\n' ' else:\n' ' spec.loader.exec_module(module)\n' ' finally:\n' ' # Update the order of insertion into sys.modules for module\n' ' # clean-up at shutdown.\n' ' module = sys.modules.pop(spec.name)\n' ' sys.modules[spec.name] = module\n' ' return module\n' '\n' '\n' 'def _load_backward_compatible(spec):\n' ' # (issue19713) Once BuiltinImporter and ExtensionFileLoader\n' ' # have exec_module() implemented, we can add a deprecation\n' ' # warning here.\n' ' try:\n' ' spec.loader.load_module(spec.name)\n' ' except:\n' ' if spec.name in sys.modules:\n' ' module = sys.modules.pop(spec.name)\n' ' sys.modules[spec.name] = module\n' ' raise\n' ' # The module must be in sys.modules at this point!\n' ' # Move it to the end of sys.modules.\n' ' module = sys.modules.pop(spec.name)\n' ' sys.modules[spec.name] = module\n' " if getattr(module, '__loader__', None) is None:\n" ' try:\n' ' module.__loader__ = spec.loader\n' ' except AttributeError:\n' ' pass\n' " if getattr(module, '__package__', None) is None:\n" ' try:\n' ' # Since module.__path__ may not line up with\n' " # spec.submodule_search_paths, we can't necessarily rely\n" ' # on spec.parent here.\n' ' module.__package__ = module.__name__\n' " if not hasattr(module, '__path__'):\n" " module.__package__ = spec.name.rpartition('.')[0]\n" ' except AttributeError:\n' ' pass\n' " if getattr(module, '__spec__', None) is None:\n" ' try:\n' ' module.__spec__ = spec\n' ' except AttributeError:\n' ' pass\n' ' return module\n' '\n' 'def _load_unlocked(spec):\n' ' # A helper for direct use by the import system.\n' ' if spec.loader is not None:\n' ' # Not a namespace package.\n' " if not hasattr(spec.loader, 'exec_module'):\n" ' return _load_backward_compatible(spec)\n' '\n' ' module = module_from_spec(spec)\n' '\n' ' # This must be done before putting the module in sys.modules\n' ' # (otherwise an optimization shortcut in import.c becomes\n' ' # wrong).\n' ' spec._initializing = True\n' ' try:\n' ' sys.modules[spec.name] = module\n' ' try:\n' ' if spec.loader is None:\n' ' if spec.submodule_search_locations is None:\n' " raise ImportError('missing loader', name=spec.name)\n" ' # A namespace package so do nothing.\n' ' else:\n' ' spec.loader.exec_module(module)\n' ' except:\n' ' try:\n' ' del sys.modules[spec.name]\n' ' except KeyError:\n' ' pass\n' ' raise\n' ' 
# Move the module to the end of sys.modules.\n' " # We don't ensure that the import-related module attributes get\n" ' # set in the sys.modules replacement case. Such modules are on\n' ' # their own.\n' ' module = sys.modules.pop(spec.name)\n' ' sys.modules[spec.name] = module\n' " _verbose_message('import {!r} # {!r}', spec.name, spec.loader)\n" ' finally:\n' ' spec._initializing = False\n' '\n' ' return module\n' '\n' '# A method used during testing of _load_unlocked() and by\n' '# _load_module_shim().\n' 'def _load(spec):\n' ' """Return a new module object, loaded by the spec\'s loader.\n' '\n' ' The module is not added to its parent.\n' '\n' ' If a module is already in sys.modules, that existing module gets\n' ' clobbered.\n' '\n' ' """\n' ' with _ModuleLockManager(spec.name):\n' ' return _load_unlocked(spec)\n' '\n' '\n' '# Loaders ' '#####################################################################\n' '\n' 'class BuiltinImporter:\n' '\n' ' """Meta path import for built-in modules.\n' '\n' ' All methods are either class or static methods to avoid the need to\n' ' instantiate the class.\n' '\n' ' """\n' '\n' ' @staticmethod\n' ' def module_repr(module):\n' ' """Return repr for the module.\n' '\n' ' The method is deprecated. The import machinery does the job ' 'itself.\n' '\n' ' """\n' " return ''.format(module.__name__)\n" '\n' ' @classmethod\n' ' def find_spec(cls, fullname, path=None, target=None):\n' ' if path is not None:\n' ' return None\n' ' if _imp.is_builtin(fullname):\n' " return spec_from_loader(fullname, cls, origin='built-in')\n" ' else:\n' ' return None\n' '\n' ' @classmethod\n' ' def find_module(cls, fullname, path=None):\n' ' """Find the built-in module.\n' '\n' " If 'path' is ever specified then the search is considered a " 'failure.\n' '\n' ' This method is deprecated. Use find_spec() instead.\n' '\n' ' """\n' ' spec = cls.find_spec(fullname, path)\n' ' return spec.loader if spec is not None else None\n' '\n' ' @classmethod\n' ' def create_module(self, spec):\n' ' """Create a built-in module"""\n' ' if spec.name not in sys.builtin_module_names:\n' " raise ImportError('{!r} is not a built-in " "module'.format(spec.name),\n" ' name=spec.name)\n' ' return _call_with_frames_removed(_imp.create_builtin, spec)\n' '\n' ' @classmethod\n' ' def exec_module(self, module):\n' ' """Exec a built-in module"""\n' ' _call_with_frames_removed(_imp.exec_builtin, module)\n' '\n' ' @classmethod\n' ' @_requires_builtin\n' ' def get_code(cls, fullname):\n' ' """Return None as built-in modules do not have code objects."""\n' ' return None\n' '\n' ' @classmethod\n' ' @_requires_builtin\n' ' def get_source(cls, fullname):\n' ' """Return None as built-in modules do not have source code."""\n' ' return None\n' '\n' ' @classmethod\n' ' @_requires_builtin\n' ' def is_package(cls, fullname):\n' ' """Return False as built-in modules are never packages."""\n' ' return False\n' '\n' ' load_module = classmethod(_load_module_shim)\n' '\n' '\n' 'class FrozenImporter:\n' '\n' ' """Meta path import for frozen modules.\n' '\n' ' All methods are either class or static methods to avoid the need to\n' ' instantiate the class.\n' '\n' ' """\n' '\n' ' _ORIGIN = "frozen"\n' '\n' ' @staticmethod\n' ' def module_repr(m):\n' ' """Return repr for the module.\n' '\n' ' The method is deprecated. 
The import machinery does the job ' 'itself.\n' '\n' ' """\n' " return ''.format(m.__name__, " 'FrozenImporter._ORIGIN)\n' '\n' ' @classmethod\n' ' def find_spec(cls, fullname, path=None, target=None):\n' ' if _imp.is_frozen(fullname):\n' ' return spec_from_loader(fullname, cls, origin=cls._ORIGIN)\n' ' else:\n' ' return None\n' '\n' ' @classmethod\n' ' def find_module(cls, fullname, path=None):\n' ' """Find a frozen module.\n' '\n' ' This method is deprecated. Use find_spec() instead.\n' '\n' ' """\n' ' return cls if _imp.is_frozen(fullname) else None\n' '\n' ' @classmethod\n' ' def create_module(cls, spec):\n' ' """Use default semantics for module creation."""\n' '\n' ' @staticmethod\n' ' def exec_module(module):\n' ' name = module.__spec__.name\n' ' if not _imp.is_frozen(name):\n' " raise ImportError('{!r} is not a frozen module'.format(name),\n" ' name=name)\n' ' code = _call_with_frames_removed(_imp.get_frozen_object, name)\n' ' exec(code, module.__dict__)\n' '\n' ' @classmethod\n' ' def load_module(cls, fullname):\n' ' """Load a frozen module.\n' '\n' ' This method is deprecated. Use exec_module() instead.\n' '\n' ' """\n' ' return _load_module_shim(cls, fullname)\n' '\n' ' @classmethod\n' ' @_requires_frozen\n' ' def get_code(cls, fullname):\n' ' """Return the code object for the frozen module."""\n' ' return _imp.get_frozen_object(fullname)\n' '\n' ' @classmethod\n' ' @_requires_frozen\n' ' def get_source(cls, fullname):\n' ' """Return None as frozen modules do not have source code."""\n' ' return None\n' '\n' ' @classmethod\n' ' @_requires_frozen\n' ' def is_package(cls, fullname):\n' ' """Return True if the frozen module is a package."""\n' ' return _imp.is_frozen_package(fullname)\n' '\n' '\n' '# Import itself ' '###############################################################\n' '\n' 'class _ImportLockContext:\n' '\n' ' """Context manager for the import lock."""\n' '\n' ' def __enter__(self):\n' ' """Acquire the import lock."""\n' ' _imp.acquire_lock()\n' '\n' ' def __exit__(self, exc_type, exc_value, exc_traceback):\n' ' """Release the import lock regardless of any raised exceptions."""\n' ' _imp.release_lock()\n' '\n' '\n' 'def _resolve_name(name, package, level):\n' ' """Resolve a relative module name to an absolute one."""\n' " bits = package.rsplit('.', level - 1)\n" ' if len(bits) < level:\n' " raise ValueError('attempted relative import beyond top-level " "package')\n" ' base = bits[0]\n' " return '{}.{}'.format(base, name) if name else base\n" '\n' '\n' 'def _find_spec_legacy(finder, name, path):\n' ' # This would be a good place for a DeprecationWarning if\n' ' # we ended up going that route.\n' ' loader = finder.find_module(name, path)\n' ' if loader is None:\n' ' return None\n' ' return spec_from_loader(name, loader)\n' '\n' '\n' 'def _find_spec(name, path, target=None):\n' ' """Find a module\'s spec."""\n' ' meta_path = sys.meta_path\n' ' if meta_path is None:\n' ' # PyImport_Cleanup() is running or has been called.\n' ' raise ImportError("sys.meta_path is None, Python is likely "\n' ' "shutting down")\n' '\n' ' if not meta_path:\n' " _warnings.warn('sys.meta_path is empty', ImportWarning)\n" '\n' ' # We check sys.modules here for the reload case. 
While a passed-in\n' ' # target will usually indicate a reload there is no guarantee, whereas\n' ' # sys.modules provides one.\n' ' is_reload = name in sys.modules\n' ' for finder in meta_path:\n' ' with _ImportLockContext():\n' ' try:\n' ' find_spec = finder.find_spec\n' ' except AttributeError:\n' ' spec = _find_spec_legacy(finder, name, path)\n' ' if spec is None:\n' ' continue\n' ' else:\n' ' spec = find_spec(name, path, target)\n' ' if spec is not None:\n' ' # The parent import may have already imported this module.\n' ' if not is_reload and name in sys.modules:\n' ' module = sys.modules[name]\n' ' try:\n' ' __spec__ = module.__spec__\n' ' except AttributeError:\n' ' # We use the found spec since that is the one that\n' " # we would have used if the parent module hadn't\n" ' # beaten us to the punch.\n' ' return spec\n' ' else:\n' ' if __spec__ is None:\n' ' return spec\n' ' else:\n' ' return __spec__\n' ' else:\n' ' return spec\n' ' else:\n' ' return None\n' '\n' '\n' 'def _sanity_check(name, package, level):\n' ' """Verify arguments are "sane"."""\n' ' if not isinstance(name, str):\n' " raise TypeError('module name must be str, not " "{}'.format(type(name)))\n" ' if level < 0:\n' " raise ValueError('level must be >= 0')\n" ' if level > 0:\n' ' if not isinstance(package, str):\n' " raise TypeError('__package__ not set to a string')\n" ' elif not package:\n' " raise ImportError('attempted relative import with no known " "parent '\n" " 'package')\n" ' if not name and level == 0:\n' " raise ValueError('Empty module name')\n" '\n' '\n' "_ERR_MSG_PREFIX = 'No module named '\n" "_ERR_MSG = _ERR_MSG_PREFIX + '{!r}'\n" '\n' 'def _find_and_load_unlocked(name, import_):\n' ' path = None\n' " parent = name.rpartition('.')[0]\n" ' if parent:\n' ' if parent not in sys.modules:\n' ' _call_with_frames_removed(import_, parent)\n' ' # Crazy side-effects!\n' ' if name in sys.modules:\n' ' return sys.modules[name]\n' ' parent_module = sys.modules[parent]\n' ' try:\n' ' path = parent_module.__path__\n' ' except AttributeError:\n' " msg = (_ERR_MSG + '; {!r} is not a package').format(name, " 'parent)\n' ' raise ModuleNotFoundError(msg, name=name) from None\n' ' spec = _find_spec(name, path)\n' ' if spec is None:\n' ' raise ModuleNotFoundError(_ERR_MSG.format(name), name=name)\n' ' else:\n' ' module = _load_unlocked(spec)\n' ' if parent:\n' ' # Set the module as an attribute on its parent.\n' ' parent_module = sys.modules[parent]\n' " setattr(parent_module, name.rpartition('.')[2], module)\n" ' return module\n' '\n' '\n' '_NEEDS_LOADING = object()\n' '\n' '\n' 'def _find_and_load(name, import_):\n' ' """Find and load the module."""\n' ' with _ModuleLockManager(name):\n' ' module = sys.modules.get(name, _NEEDS_LOADING)\n' ' if module is _NEEDS_LOADING:\n' ' return _find_and_load_unlocked(name, import_)\n' '\n' ' if module is None:\n' " message = ('import of {} halted; '\n" " 'None in sys.modules'.format(name))\n" ' raise ModuleNotFoundError(message, name=name)\n' '\n' ' _lock_unlock_module(name)\n' ' return module\n' '\n' '\n' 'def _gcd_import(name, package=None, level=0):\n' ' """Import and return the module based on its name, the package the call ' 'is\n' ' being made from, and the level adjustment.\n' '\n' ' This function represents the greatest common denominator of ' 'functionality\n' ' between import_module and __import__. 
This includes setting __package__ ' 'if\n' ' the loader did not.\n' '\n' ' """\n' ' _sanity_check(name, package, level)\n' ' if level > 0:\n' ' name = _resolve_name(name, package, level)\n' ' return _find_and_load(name, _gcd_import)\n' '\n' '\n' 'def _handle_fromlist(module, fromlist, import_, *, recursive=False):\n' ' """Figure out what __import__ should return.\n' '\n' ' The import_ parameter is a callable which takes the name of module to\n' ' import. It is required to decouple the function from assuming ' "importlib's\n" ' import implementation is desired.\n' '\n' ' """\n' ' # The hell that is fromlist ...\n' ' # If a package was imported, try to import stuff from fromlist.\n' ' for x in fromlist:\n' ' if not isinstance(x, str):\n' ' if recursive:\n' " where = module.__name__ + '.__all__'\n" ' else:\n' ' where = "``from list\'\'"\n' ' raise TypeError(f"Item in {where} must be str, "\n' ' f"not {type(x).__name__}")\n' " elif x == '*':\n" " if not recursive and hasattr(module, '__all__'):\n" ' _handle_fromlist(module, module.__all__, import_,\n' ' recursive=True)\n' ' elif not hasattr(module, x):\n' " from_name = '{}.{}'.format(module.__name__, x)\n" ' try:\n' ' _call_with_frames_removed(import_, from_name)\n' ' except ModuleNotFoundError as exc:\n' ' # Backwards-compatibility dictates we ignore failed\n' " # imports triggered by fromlist for modules that don't\n" ' # exist.\n' ' if (exc.name == from_name and\n' ' sys.modules.get(from_name, _NEEDS_LOADING) is not ' 'None):\n' ' continue\n' ' raise\n' ' return module\n' '\n' '\n' 'def _calc___package__(globals):\n' ' """Calculate what __package__ should be.\n' '\n' ' __package__ is not guaranteed to be defined or could be set to None\n' ' to represent that its proper value is unknown.\n' '\n' ' """\n' " package = globals.get('__package__')\n" " spec = globals.get('__spec__')\n" ' if package is not None:\n' ' if spec is not None and package != spec.parent:\n' ' _warnings.warn("__package__ != __spec__.parent "\n' ' f"({package!r} != {spec.parent!r})",\n' ' ImportWarning, stacklevel=3)\n' ' return package\n' ' elif spec is not None:\n' ' return spec.parent\n' ' else:\n' ' _warnings.warn("can\'t resolve package from __spec__ or __package__, ' '"\n' ' "falling back on __name__ and __path__",\n' ' ImportWarning, stacklevel=3)\n' " package = globals['__name__']\n" " if '__path__' not in globals:\n" " package = package.rpartition('.')[0]\n" ' return package\n' '\n' '\n' 'def __import__(name, globals=None, locals=None, fromlist=(), level=0):\n' ' """Import a module.\n' '\n' " The 'globals' argument is used to infer where the import is occurring " 'from\n' " to handle relative imports. The 'locals' argument is ignored. The\n" " 'fromlist' argument specifies what should exist as attributes on the " 'module\n' " being imported (e.g. ``from module import ``). The 'level'\n" ' argument represents the package location to import from in a relative\n' " import (e.g. ``from ..pkg import mod`` would have a 'level' of 2).\n" '\n' ' """\n' ' if level == 0:\n' ' module = _gcd_import(name)\n' ' else:\n' ' globals_ = globals if globals is not None else {}\n' ' package = _calc___package__(globals_)\n' ' module = _gcd_import(name, package, level)\n' ' if not fromlist:\n' " # Return up to the first dot in 'name'. 
This is complicated by the " 'fact\n' " # that 'name' may be relative.\n" ' if level == 0:\n' " return _gcd_import(name.partition('.')[0])\n" ' elif not name:\n' ' return module\n' ' else:\n' " # Figure out where to slice the module's name up to the first " 'dot\n' " # in 'name'.\n" " cut_off = len(name) - len(name.partition('.')[0])\n" ' # Slice end needs to be positive to alleviate need to ' 'special-case\n' " # when ``'.' not in name``.\n" ' return ' 'sys.modules[module.__name__[:len(module.__name__)-cut_off]]\n' " elif hasattr(module, '__path__'):\n" ' return _handle_fromlist(module, fromlist, _gcd_import)\n' ' else:\n' ' return module\n' '\n' '\n' 'def _builtin_from_name(name):\n' ' spec = BuiltinImporter.find_spec(name)\n' ' if spec is None:\n' " raise ImportError('no built-in module named ' + name)\n" ' return _load_unlocked(spec)\n' '\n' '\n' 'def _setup(sys_module, _imp_module):\n' ' """Setup importlib by importing needed built-in modules and injecting ' 'them\n' ' into the global namespace.\n' '\n' ' As sys is needed for sys.modules access and _imp is needed to load ' 'built-in\n' ' modules, those two modules must be explicitly passed in.\n' '\n' ' """\n' ' global _imp, sys\n' ' _imp = _imp_module\n' ' sys = sys_module\n' '\n' ' # Set up the spec for existing builtin/frozen modules.\n' ' module_type = type(sys)\n' ' for name, module in sys.modules.items():\n' ' if isinstance(module, module_type):\n' ' if name in sys.builtin_module_names:\n' ' loader = BuiltinImporter\n' ' elif _imp.is_frozen(name):\n' ' loader = FrozenImporter\n' ' else:\n' ' continue\n' ' spec = _spec_from_module(module, loader)\n' ' _init_module_attrs(spec, module)\n' '\n' ' # Directly load built-in modules needed during bootstrap.\n' ' self_module = sys.modules[__name__]\n' " for builtin_name in ('_thread', '_warnings', '_weakref'):\n" ' if builtin_name not in sys.modules:\n' ' builtin_module = _builtin_from_name(builtin_name)\n' ' else:\n' ' builtin_module = sys.modules[builtin_name]\n' ' setattr(self_module, builtin_name, builtin_module)\n' '\n' '\n' 'def _install(sys_module, _imp_module):\n' ' """Install importers for builtin and frozen modules"""\n' ' _setup(sys_module, _imp_module)\n' '\n' ' sys.meta_path.append(BuiltinImporter)\n' ' sys.meta_path.append(FrozenImporter)\n' '\n' '\n' 'def _install_external_importers():\n' ' """Install importers that require external filesystem access"""\n' ' global _bootstrap_external\n' ' import _frozen_importlib_external\n' ' _bootstrap_external = _frozen_importlib_external\n' ' _frozen_importlib_external._install(sys.modules[__name__])\n') start = 1646367782.3532834 tests/test_mark_tokens.py:639: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = ('"""Core implementation of import.\n' '\n' 'This module is NOT meant to be directly imported! It has been designed such\n' 'that it can be bootstrapped into Python as the implementation of import. As\n' 'such it requires the injection of specific modules and attributes in order ' 'to\n' 'work. One should use importlib as the public-facing version of this module.\n' '\n' '"""\n' '#\n' '# IMPORTANT: Whenever making changes to this module, be sure to run a ' 'top-level\n' '# `make regen-importlib` followed by `make` in order to get the frozen ' 'version\n' '# of the module updated. 
While a passed-in\n' ' # target will usually indicate a reload there is no guarantee, whereas\n' ' # sys.modules provides one.\n' ' is_reload = name in sys.modules\n' ' for finder in meta_path:\n' ' with _ImportLockContext():\n' ' try:\n' ' find_spec = finder.find_spec\n' ' except AttributeError:\n' ' spec = _find_spec_legacy(finder, name, path)\n' ' if spec is None:\n' ' continue\n' ' else:\n' ' spec = find_spec(name, path, target)\n' ' if spec is not None:\n' ' # The parent import may have already imported this module.\n' ' if not is_reload and name in sys.modules:\n' ' module = sys.modules[name]\n' ' try:\n' ' __spec__ = module.__spec__\n' ' except AttributeError:\n' ' # We use the found spec since that is the one that\n' " # we would have used if the parent module hadn't\n" ' # beaten us to the punch.\n' ' return spec\n' ' else:\n' ' if __spec__ is None:\n' ' return spec\n' ' else:\n' ' return __spec__\n' ' else:\n' ' return spec\n' ' else:\n' ' return None\n' '\n' '\n' 'def _sanity_check(name, package, level):\n' ' """Verify arguments are "sane"."""\n' ' if not isinstance(name, str):\n' " raise TypeError('module name must be str, not " "{}'.format(type(name)))\n" ' if level < 0:\n' " raise ValueError('level must be >= 0')\n" ' if level > 0:\n' ' if not isinstance(package, str):\n' " raise TypeError('__package__ not set to a string')\n" ' elif not package:\n' " raise ImportError('attempted relative import with no known " "parent '\n" " 'package')\n" ' if not name and level == 0:\n' " raise ValueError('Empty module name')\n" '\n' '\n' "_ERR_MSG_PREFIX = 'No module named '\n" "_ERR_MSG = _ERR_MSG_PREFIX + '{!r}'\n" '\n' 'def _find_and_load_unlocked(name, import_):\n' ' path = None\n' " parent = name.rpartition('.')[0]\n" ' if parent:\n' ' if parent not in sys.modules:\n' ' _call_with_frames_removed(import_, parent)\n' ' # Crazy side-effects!\n' ' if name in sys.modules:\n' ' return sys.modules[name]\n' ' parent_module = sys.modules[parent]\n' ' try:\n' ' path = parent_module.__path__\n' ' except AttributeError:\n' " msg = (_ERR_MSG + '; {!r} is not a package').format(name, " 'parent)\n' ' raise ModuleNotFoundError(msg, name=name) from None\n' ' spec = _find_spec(name, path)\n' ' if spec is None:\n' ' raise ModuleNotFoundError(_ERR_MSG.format(name), name=name)\n' ' else:\n' ' module = _load_unlocked(spec)\n' ' if parent:\n' ' # Set the module as an attribute on its parent.\n' ' parent_module = sys.modules[parent]\n' " setattr(parent_module, name.rpartition('.')[2], module)\n" ' return module\n' '\n' '\n' '_NEEDS_LOADING = object()\n' '\n' '\n' 'def _find_and_load(name, import_):\n' ' """Find and load the module."""\n' ' with _ModuleLockManager(name):\n' ' module = sys.modules.get(name, _NEEDS_LOADING)\n' ' if module is _NEEDS_LOADING:\n' ' return _find_and_load_unlocked(name, import_)\n' '\n' ' if module is None:\n' " message = ('import of {} halted; '\n" " 'None in sys.modules'.format(name))\n" ' raise ModuleNotFoundError(message, name=name)\n' '\n' ' _lock_unlock_module(name)\n' ' return module\n' '\n' '\n' 'def _gcd_import(name, package=None, level=0):\n' ' """Import and return the module based on its name, the package the call ' 'is\n' ' being made from, and the level adjustment.\n' '\n' ' This function represents the greatest common denominator of ' 'functionality\n' ' between import_module and __import__. 
This includes setting __package__ ' 'if\n' ' the loader did not.\n' '\n' ' """\n' ' _sanity_check(name, package, level)\n' ' if level > 0:\n' ' name = _resolve_name(name, package, level)\n' ' return _find_and_load(name, _gcd_import)\n' '\n' '\n' 'def _handle_fromlist(module, fromlist, import_, *, recursive=False):\n' ' """Figure out what __import__ should return.\n' '\n' ' The import_ parameter is a callable which takes the name of module to\n' ' import. It is required to decouple the function from assuming ' "importlib's\n" ' import implementation is desired.\n' '\n' ' """\n' ' # The hell that is fromlist ...\n' ' # If a package was imported, try to import stuff from fromlist.\n' ' for x in fromlist:\n' ' if not isinstance(x, str):\n' ' if recursive:\n' " where = module.__name__ + '.__all__'\n" ' else:\n' ' where = "``from list\'\'"\n' ' raise TypeError(f"Item in {where} must be str, "\n' ' f"not {type(x).__name__}")\n' " elif x == '*':\n" " if not recursive and hasattr(module, '__all__'):\n" ' _handle_fromlist(module, module.__all__, import_,\n' ' recursive=True)\n' ' elif not hasattr(module, x):\n' " from_name = '{}.{}'.format(module.__name__, x)\n" ' try:\n' ' _call_with_frames_removed(import_, from_name)\n' ' except ModuleNotFoundError as exc:\n' ' # Backwards-compatibility dictates we ignore failed\n' " # imports triggered by fromlist for modules that don't\n" ' # exist.\n' ' if (exc.name == from_name and\n' ' sys.modules.get(from_name, _NEEDS_LOADING) is not ' 'None):\n' ' continue\n' ' raise\n' ' return module\n' '\n' '\n' 'def _calc___package__(globals):\n' ' """Calculate what __package__ should be.\n' '\n' ' __package__ is not guaranteed to be defined or could be set to None\n' ' to represent that its proper value is unknown.\n' '\n' ' """\n' " package = globals.get('__package__')\n" " spec = globals.get('__spec__')\n" ' if package is not None:\n' ' if spec is not None and package != spec.parent:\n' ' _warnings.warn("__package__ != __spec__.parent "\n' ' f"({package!r} != {spec.parent!r})",\n' ' ImportWarning, stacklevel=3)\n' ' return package\n' ' elif spec is not None:\n' ' return spec.parent\n' ' else:\n' ' _warnings.warn("can\'t resolve package from __spec__ or __package__, ' '"\n' ' "falling back on __name__ and __path__",\n' ' ImportWarning, stacklevel=3)\n' " package = globals['__name__']\n" " if '__path__' not in globals:\n" " package = package.rpartition('.')[0]\n" ' return package\n' '\n' '\n' 'def __import__(name, globals=None, locals=None, fromlist=(), level=0):\n' ' """Import a module.\n' '\n' " The 'globals' argument is used to infer where the import is occurring " 'from\n' " to handle relative imports. The 'locals' argument is ignored. The\n" " 'fromlist' argument specifies what should exist as attributes on the " 'module\n' " being imported (e.g. ``from module import ``). The 'level'\n" ' argument represents the package location to import from in a relative\n' " import (e.g. ``from ..pkg import mod`` would have a 'level' of 2).\n" '\n' ' """\n' ' if level == 0:\n' ' module = _gcd_import(name)\n' ' else:\n' ' globals_ = globals if globals is not None else {}\n' ' package = _calc___package__(globals_)\n' ' module = _gcd_import(name, package, level)\n' ' if not fromlist:\n' " # Return up to the first dot in 'name'. 
This is complicated by the " 'fact\n' " # that 'name' may be relative.\n" ' if level == 0:\n' " return _gcd_import(name.partition('.')[0])\n" ' elif not name:\n' ' return module\n' ' else:\n' " # Figure out where to slice the module's name up to the first " 'dot\n' " # in 'name'.\n" " cut_off = len(name) - len(name.partition('.')[0])\n" ' # Slice end needs to be positive to alleviate need to ' 'special-case\n' " # when ``'.' not in name``.\n" ' return ' 'sys.modules[module.__name__[:len(module.__name__)-cut_off]]\n' " elif hasattr(module, '__path__'):\n" ' return _handle_fromlist(module, fromlist, _gcd_import)\n' ' else:\n' ' return module\n' '\n' '\n' 'def _builtin_from_name(name):\n' ' spec = BuiltinImporter.find_spec(name)\n' ' if spec is None:\n' " raise ImportError('no built-in module named ' + name)\n" ' return _load_unlocked(spec)\n' '\n' '\n' 'def _setup(sys_module, _imp_module):\n' ' """Setup importlib by importing needed built-in modules and injecting ' 'them\n' ' into the global namespace.\n' '\n' ' As sys is needed for sys.modules access and _imp is needed to load ' 'built-in\n' ' modules, those two modules must be explicitly passed in.\n' '\n' ' """\n' ' global _imp, sys\n' ' _imp = _imp_module\n' ' sys = sys_module\n' '\n' ' # Set up the spec for existing builtin/frozen modules.\n' ' module_type = type(sys)\n' ' for name, module in sys.modules.items():\n' ' if isinstance(module, module_type):\n' ' if name in sys.builtin_module_names:\n' ' loader = BuiltinImporter\n' ' elif _imp.is_frozen(name):\n' ' loader = FrozenImporter\n' ' else:\n' ' continue\n' ' spec = _spec_from_module(module, loader)\n' ' _init_module_attrs(spec, module)\n' '\n' ' # Directly load built-in modules needed during bootstrap.\n' ' self_module = sys.modules[__name__]\n' " for builtin_name in ('_thread', '_warnings', '_weakref'):\n" ' if builtin_name not in sys.modules:\n' ' builtin_module = _builtin_from_name(builtin_name)\n' ' else:\n' ' builtin_module = sys.modules[builtin_name]\n' ' setattr(self_module, builtin_name, builtin_module)\n' '\n' '\n' 'def _install(sys_module, _imp_module):\n' ' """Install importers for builtin and frozen modules"""\n' ' _setup(sys_module, _imp_module)\n' '\n' ' sys.meta_path.append(BuiltinImporter)\n' ' sys.meta_path.append(FrozenImporter)\n' '\n' '\n' 'def _install_external_importers():\n' ' """Install importers that require external filesystem access"""\n' ' global _bootstrap_external\n' ' import _frozen_importlib_external\n' ' _bootstrap_external = _frozen_importlib_external\n' ' _frozen_importlib_external._install(sys.modules[__name__])\n') verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[104 chars] 27,\n 0,\n 27,\n 9],\n de[2065 chars]))])" != "Func[104 chars] 2,\n 0,\n 2,\n 9],\n deco[2063 chars]))])" E FunctionDef( E name='_wrap', E doc='Simple substitute for functools.update_wrapper.', E position=[ E - 27, E ? - E + 2, E 0, E - 27, E ? 
- E + 2, E 9], E decorators=None, E args=Arguments( E vararg=None, E kwarg=None, E args=[Name(name='new'), Name(name='old')], E defaults=[], E kwonlyargs=[], E posonlyargs=[], E posonlyargs_annotations=[], E kw_defaults=[], E annotations=[None, None], E varargannotation=None, E kwargannotation=None, E kwonlyargs_annotations=[], E type_comment_args=[None, None], E type_comment_kwonlyargs=[], E type_comment_posonlyargs=[]), E returns=None, E body=[For( E target=Name(name='replace'), E iter=List( E ctx=, E elts=[ E Const( E value='__module__', E kind=None), E Const( E value='__name__', E kind=None), E Const( E value='__qualname__', E kind=None), E Const( E value='__doc__', E kind=None)]), E body=[If( E test=Call( E func=Name(name='hasattr'), E args=[Name(name='old'), Name(name='replace')], E keywords=[]), E body=[Expr(value=Call( E func=Name(name='setattr'), E args=[ E Name(name='new'), E Name(name='replace'), E Call( E func=Name(name='getattr'), E args=[Name(name='old'), Name(name='replace')], E keywords=[])], E keywords=[]))], E orelse=[])], E orelse=[]), E Expr(value=Call( E func=Attribute( E attrname='update', E expr=Attribute( E attrname='__dict__', E expr=Name(name='new'))), E args=[Attribute( E attrname='__dict__', E expr=Name(name='old'))], E keywords=[]))]) node = rebuilt_node = self = test_case = tested_nodes = 4 text = ('def _wrap(new, old):\n' ' """Simple substitute for functools.update_wrapper."""\n' " for replace in ['__module__', '__name__', '__qualname__', '__doc__']:\n" ' if hasattr(old, replace):\n' ' setattr(new, replace, getattr(old, replace))\n' ' new.__dict__.update(old.__dict__)') ----------------------------- Captured stdout call ----------------------------- /usr/lib/python3.8/importlib/_bootstrap.py ___________________________ TestAstroid.test_tuples ____________________________ self = test_case = def verify_all_nodes(self, test_case): """ Generically test atok.get_text() on the ast tree: for each statement and expression in the tree, we extract the text, parse it, and see if it produces an equivalent tree. Returns the number of nodes that were tested this way. """ test_case.longMessage = True tested_nodes = 0 for node in self.all_nodes: if not ( util.is_stmt(node) or util.is_expr(node) or util.is_module(node) # In 3.9+, slices are now expressions in the AST, but of course their source code # can't be parsed ) or util.is_slice(node): continue text = self.atok.get_text(node) # await is not allowed outside async functions below 3.7 # parsing again would give a syntax error if 'await' in text and 'async def' not in text and sys.version_info < (3, 7): continue # `elif:` is really just `else: if:` to the AST, # so get_text can return text starting with elif when given an If node. # This is generally harmless and there's probably no good alternative, # but in isolation it's invalid syntax text = re.sub(r'^(\s*)elif(\W)', r'\1if\2', text, re.MULTILINE) rebuilt_node = test_case.parse_snippet(text, node) try: > test_case.assert_nodes_equal(node, rebuilt_node) node = rebuilt_node = self = test_case = tested_nodes = 1 text = 'def foo(a=()): ((x, (y,)),) = ((), (a,),),' tests/tools.py:99: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = t2 = def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): > self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) context_classes_group = (, , ) self = t1 = t2 = tests/test_mark_tokens.py:790: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ...] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ...] def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7))] t2 = [('decorators', None), ('args', ), ('returns', None), ('body', []), ('name', 'foo'), ('doc', None), ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7))] vc1 = ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7)) vc2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7)) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7)) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = ('position', Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7)) t2 = ('position', Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7)) vc1 = Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7) vc2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7) tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = t1 = Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7) def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. 
for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): > self.assert_nodes_equal(vc1, vc2) context_classes_group = (, , ) self = t1 = Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7) t2 = Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7) vc1 = 1 vc2 = 2 tests/test_mark_tokens.py:788: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , t1 = 1, t2 = 2 def assert_nodes_equal(self, t1, t2): # Ignore the context of each node which can change when parsing # substrings of source code. We just want equal structure and contents. for context_classes_group in self.context_classes: if isinstance(t1, context_classes_group): self.assertIsInstance(t2, context_classes_group) break else: self.assertEqual(type(t1), type(t2)) if isinstance(t1, (list, tuple)): self.assertEqual(len(t1), len(t2)) for vc1, vc2 in zip(t1, t2): self.assert_nodes_equal(vc1, vc2) elif isinstance(t1, self.nodes_classes): self.assert_nodes_equal( list(self.iter_fields(t1)), list(self.iter_fields(t2)), ) else: # Weird bug in astroid that collapses spaces in docstrings sometimes maybe if self.is_astroid_test and isinstance(t1, six.string_types): t1 = re.sub(r'^ +$', '', t1, flags=re.MULTILINE) t2 = re.sub(r'^ +$', '', t2, flags=re.MULTILINE) > self.assertEqual(t1, t2) context_classes_group = (, , ) self = t1 = 1 t2 = 2 tests/test_mark_tokens.py:800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 1 second = 2, msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. 
""" assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) assertion_func = > first = 1 msg = None second = 2 self = /usr/lib/python3.8/unittest/case.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , first = 1 second = 2, msg = '1 != 2' def _baseAssertEqual(self, first, second, msg=None): """The default assertEqual implementation, not type specific.""" if not first == second: standardMsg = '%s != %s' % _common_shorten_repr(first, second) msg = self._formatMessage(msg, standardMsg) > raise self.failureException(msg) E AssertionError: 1 != 2 first = 1 msg = '1 != 2' second = 2 self = standardMsg = '1 != 2' /usr/lib/python3.8/unittest/case.py:905: AssertionError During handling of the above exception, another exception occurred: self = def test_tuples(self): def get_tuples(code): m = self.create_mark_checker(code) return [m.atok.get_text(n) for n in m.all_nodes if n.__class__.__name__ == "Tuple"] self.assertEqual(get_tuples("a,"), ["a,"]) self.assertEqual(get_tuples("(a,)"), ["(a,)"]) self.assertEqual(get_tuples("(a),"), ["(a),"]) self.assertEqual(get_tuples("((a),)"), ["((a),)"]) self.assertEqual(get_tuples("(a,),"), ["(a,),", "(a,)"]) self.assertEqual(get_tuples("((a,),)"), ["((a,),)", "(a,)"]) self.assertEqual(get_tuples("()"), ["()"]) self.assertEqual(get_tuples("(),"), ["(),", "()"]) self.assertEqual(get_tuples("((),)"), ["((),)", "()"]) self.assertEqual(get_tuples("((),(a,))"), ["((),(a,))", "()", "(a,)"]) self.assertEqual(get_tuples("((),(a,),)"), ["((),(a,),)", "()", "(a,)"]) self.assertEqual(get_tuples("((),(a,),),"), ["((),(a,),),", "((),(a,),)", "()", "(a,)"]) self.assertEqual(get_tuples('((foo, bar),)'), ['((foo, bar),)', '(foo, bar)']) self.assertEqual(get_tuples('(foo, bar),'), ['(foo, bar),', '(foo, bar)']) > self.assertEqual(get_tuples('def foo(a=()): ((x, (y,)),) = ((), (a,),),'), [ '()', '((x, (y,)),)', '(x, (y,))', '(y,)', '((), (a,),),', '((), (a,),)', '()', '(a,)']) get_tuples = .get_tuples at 0x7f812b15c4c0> self = tests/test_mark_tokens.py:416: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ tests/test_mark_tokens.py:399: in get_tuples m = self.create_mark_checker(code) code = 'def foo(a=()): ((x, (y,)),) = ((), (a,),),' self = tests/test_mark_tokens.py:43: in create_mark_checker checker.verify_all_nodes(self) atok = checker = node = self = source = 'def foo(a=()): ((x, (y,)),) = ((), (a,),),' verify = True tests/tools.py:103: in verify_all_nodes test_case.assertEqual( E AssertionError: "Func[56 chars] 1,\n 0,\n 1,\n 7],\n dec[1218 chars]))])" != "Func[56 chars] 2,\n 0,\n 2,\n 7],\n dec[1218 chars]))])" E FunctionDef( E name='foo', E doc=None, E position=[ E - 1, E ? ^ E + 2, E ? ^ E 0, E - 1, E ? ^ E + 2, E ? 
^
E    7],
E    decorators=None,
E    args=Arguments(
E    vararg=None,
E    kwarg=None,
E    args=[Name(name='a')],
E    defaults=[Tuple(
E    ctx=,
E    elts=[])],
E    kwonlyargs=[],
E    posonlyargs=[],
E    posonlyargs_annotations=[],
E    kw_defaults=[],
E    annotations=[None],
E    varargannotation=None,
E    kwargannotation=None,
E    kwonlyargs_annotations=[],
E    type_comment_args=[None],
E    type_comment_kwonlyargs=[],
E    type_comment_posonlyargs=[]),
E    returns=None,
E    body=[Assign(
E    targets=[Tuple(
E    ctx=,
E    elts=[Tuple(
E    ctx=,
E    elts=[Name(name='x'), Tuple(
E    ctx=,
E    elts=[Name(name='y')])])])],
E    value=Tuple(
E    ctx=,
E    elts=[Tuple(
E    ctx=,
E    elts=[Tuple(
E    ctx=,
E    elts=[]),
E    Tuple(
E    ctx=,
E    elts=[Name(name='a')])])]))])
node = 
rebuilt_node = 
self = 
test_case = 
tested_nodes = 1
text = 'def foo(a=()): ((x, (y,)),) = ((), (a,),),'
=========================== short test summary info ============================
SKIPPED [1] tests/test_mark_tokens.py:179: astroid-2.0 does not support this
FAILED tests/test_astroid.py::TestAstroid::test_assignment_expressions - Asse...
FAILED tests/test_astroid.py::TestAstroid::test_async_def - AssertionError: "...
FAILED tests/test_astroid.py::TestAstroid::test_decorators - AssertionError: ...
FAILED tests/test_astroid.py::TestAstroid::test_fixture10 - AssertionError: "...
FAILED tests/test_astroid.py::TestAstroid::test_fixture11 - AssertionError: "...
FAILED tests/test_astroid.py::TestAstroid::test_fixture13 - AssertionError: "...
FAILED tests/test_astroid.py::TestAstroid::test_fixture3 - AssertionError: "C...
FAILED tests/test_astroid.py::TestAstroid::test_fixture4 - AssertionError: "C...
FAILED tests/test_astroid.py::TestAstroid::test_fixture5 - AssertionError: "C...
FAILED tests/test_astroid.py::TestAstroid::test_fixture7 - AssertionError: "F...
FAILED tests/test_astroid.py::TestAstroid::test_fixture8 - AssertionError: "F...
FAILED tests/test_astroid.py::TestAstroid::test_fixture9 - AssertionError: "C...
FAILED tests/test_astroid.py::TestAstroid::test_fstrings - AssertionError: "F...
FAILED tests/test_astroid.py::TestAstroid::test_mark_tokens_simple - Assertio...
FAILED tests/test_astroid.py::TestAstroid::test_print_function - AssertionErr...
FAILED tests/test_astroid.py::TestAstroid::test_splat - AssertionError: "Func...
FAILED tests/test_astroid.py::TestAstroid::test_sys_modules - AssertionError:...
FAILED tests/test_astroid.py::TestAstroid::test_tuples - AssertionError: "Fun...
====== 18 failed, 87 passed, 1 skipped, 1 deselected, 1 warning in 9.72s =======
 * ERROR: dev-python/asttokens-2.0.5::gentoo failed (test phase):
 * pytest failed with python3.8
 *
 * Call stack:
 * ebuild.sh, line 127: Called src_test
 * environment, line 3203: Called distutils-r1_src_test
 * environment, line 1482: Called _distutils-r1_run_foreach_impl 'python_test'
 * environment, line 601: Called python_foreach_impl 'distutils-r1_run_phase' 'python_test'
 * environment, line 2878: Called multibuild_foreach_variant '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
 * environment, line 2372: Called _multibuild_run '_python_multibuild_wrapper' 'distutils-r1_run_phase' 'python_test'
 * environment, line 2370: Called _python_multibuild_wrapper 'distutils-r1_run_phase' 'python_test'
 * environment, line 934: Called distutils-r1_run_phase 'python_test'
 * environment, line 1409: Called python_test
 * environment, line 3170: Called epytest '--deselect' 'tests/test_astroid.py::TestAstroid::test_slices'
 * environment, line 1883: Called die
 * The specific snippet of code:
 * "${@}" || die -n "pytest failed with ${EPYTHON}";
 *
 * If you need support, post the output of `emerge --info '=dev-python/asttokens-2.0.5::gentoo'`,
 * the complete build log and the output of `emerge -pqv '=dev-python/asttokens-2.0.5::gentoo'`.
 * The complete build log is located at '/var/log/emerge-log/build/dev-python/asttokens-2.0.5:20220304-042250.log'.
 * For convenience, a symlink to the build log is located at '/var/tmp/portage/dev-python/asttokens-2.0.5/temp/build.log'.
 * The ebuild environment file is located at '/var/tmp/portage/dev-python/asttokens-2.0.5/temp/environment'.
 * Working directory: '/var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5'
 * S: '/var/tmp/portage/dev-python/asttokens-2.0.5/work/asttokens-2.0.5'
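
Note on the failure pattern: all 18 failures are in tests/test_astroid.py, and the two diffs shown above differ only in the line numbers inside the node's position field (27 vs 2, and 1 vs 2). Recent astroid releases added a position attribute to FunctionDef/ClassDef nodes recording where the def/class keyword and name sit in the parsed source, and verify_all_nodes()/assert_nodes_equal() compare every field of the original node against a node re-parsed from the snippet returned by atok.get_text(). A snippet parsed on its own gets different line numbers than the same code parsed inside the full file, so the field-by-field comparison can no longer pass. The sketch below is not code from the test suite; the padding line and variable names are invented for illustration, and it assumes an astroid version that exposes FunctionDef.position:

    # Rough sketch of the position mismatch; everything here is illustrative.
    import astroid

    # The snippet from the failing test_tuples case.
    snippet = "def foo(a=()): ((x, (y,)),) = ((), (a,),),"

    # Parse it once in context (not starting at line 1) and once on its own,
    # roughly what happens when the test re-parses the text extracted by
    # atok.get_text(node).
    original = astroid.parse("# padding line\n" + snippet + "\n").body[0]
    reparsed = astroid.parse(snippet).body[0]

    print(original.position)   # e.g. Position(lineno=2, col_offset=0, end_lineno=2, end_col_offset=7)
    print(reparsed.position)   # e.g. Position(lineno=1, col_offset=0, end_lineno=1, end_col_offset=7)

    # assert_nodes_equal() walks every field of both nodes, position included,
    # so the differing lineno fails the comparison even though the trees are
    # otherwise structurally identical.

If that reading is correct, the failures track the astroid version used at test time rather than anything musl- or hardened-specific, which is consistent with only the astroid-based test class failing while the plain-ast tests pass.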