Thu Nov 7 10:08:44 UTC 2024 I: starting to build mpi4py/trixie/armhf on jenkins on '2024-11-07 10:08'
Thu Nov 7 10:08:44 UTC 2024 I: The jenkins build log is/was available at https://jenkins.debian.net/userContent/reproducible/debian/build_service/armhf_9/10027/console.log
Thu Nov 7 10:08:44 UTC 2024 I: Downloading source for trixie/mpi4py=4.0.0-8
--2024-11-07 10:08:44--  http://deb.debian.org/debian/pool/main/m/mpi4py/mpi4py_4.0.0-8.dsc
Connecting to 46.16.76.132:3128... connected.
Proxy request sent, awaiting response... 200 OK
Length: 2510 (2.5K) [text/prs.lines.tag]
Saving to: ‘mpi4py_4.0.0-8.dsc’

     0K ..                                                    100%  173M=0s

2024-11-07 10:08:44 (173 MB/s) - ‘mpi4py_4.0.0-8.dsc’ saved [2510/2510]

Thu Nov 7 10:08:44 UTC 2024 I: mpi4py_4.0.0-8.dsc
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

Format: 3.0 (quilt)
Source: mpi4py
Binary: python3-mpi4py, python-mpi4py-doc
Architecture: any all
Version: 4.0.0-8
Maintainer: Debian Science Maintainers
Uploaders: Drew Parsons , Yaroslav Halchenko , Michael Hanke ,
Homepage: https://github.com/mpi4py/mpi4py
Standards-Version: 4.7.0
Vcs-Browser: https://salsa.debian.org/science-team/mpi4py
Vcs-Git: https://salsa.debian.org/science-team/mpi4py.git
Testsuite: autopkgtest
Testsuite-Triggers: libpmix-dev, libucx-dev, python3-all, python3-cffi, python3-dill, python3-numpy, python3-setuptools, python3-simplejson, python3-yaml
Build-Depends: debhelper-compat (= 13), dh-sequence-python3, mpi-default-dev, mpi-default-bin, openssh-client, cython3, python3-all-dev, python3-numpy, python3-sphinx, python3-setuptools
Build-Depends-Indep: texinfo, texlive, latexmk, tex-gyre, texlive-latex-extra, python3-doc, python-numpy-doc
Package-List:
 python-mpi4py-doc deb doc optional arch=all
 python3-mpi4py deb python optional arch=any
Checksums-Sha1:
 d2755c8e1e2777a4e00b8853f3ff27bc4a8e40ab 467681 mpi4py_4.0.0.orig.tar.gz
 ccad6db0b8d65bae2735528ecd950ebc59347bfc 13932 mpi4py_4.0.0-8.debian.tar.xz
Checksums-Sha256:
 be7979a72f7cede836d72ca9423227bfbfaca65e5d9e7f3639bb52e11dda2cc6 467681 mpi4py_4.0.0.orig.tar.gz
 ed58ad4f6e763f747916a88b92e8512bf80a041afd3f87b38340a0f1573b44a1 13932 mpi4py_4.0.0-8.debian.tar.xz
Files:
 554f680871fc2960d2818201afa6c467 467681 mpi4py_4.0.0.orig.tar.gz
 5c82697c1178afb0887debed7757f004 13932 mpi4py_4.0.0-8.debian.tar.xz

-----BEGIN PGP SIGNATURE-----

iQIzBAEBCAAdFiEEI8mpPlhYGekSbQo2Vz7x5L1aAfoFAmb5EPgACgkQVz7x5L1a
Afof0A/6Ai96J4loVz4EXgyeAAumkK2InkT8lHkAt3f21Mo3+9xCpWMf+Mx5ECta
rAQ6vySmCV8tyQM5ploM70RSqhORq9FfiCMbgG4a+zibne+Hc8sg8oF5VXz1SmnT
rnFtsTr3DF6rap8rG9cBMbuS/j484MU6ffh2EdMls5KDf5P2dwnRnj50VWNfLvrT
wiCkLgqTN8sxZ6AkD4JJzwqaDLa7/rYEsdCZXLW0XR8tTaNlGE913eTa9f0veYIU
plOdRn980L/DXKSX9YI2wDQXudCiNbY/vctt+nMe6YGQBbsM0Gw2KyBZuP44XhF6
Zt6SmKQUKKYVQfSiT/6POn+WYYcEyqWs9eYJU3/RbyV+1ttepdNxjgnnpFjzKJe/
gki20DZ3HXjjUUj09ScwBvYr5sMF3iC2EXoiIV/DPug6Z/jWC8zG9tKiqoYh8E8z
HGk/N1EK/78tFouAsrxy/cnOVjg9bk24JS/zCxuSjIcb1fnowDSTetdXmU5FjZaX
ohSOs+ZjiEJDCzdFZ7Yz9X+fB3BYudm3hAfM+Q+ZGTmng1alB8vloX8lIuQ0FboR
VA5+9pHSP3r0T4DkIbHOynOJZKEIiScTCb5wFypvoIlpL5YrlYhyX3nI+v+j2Ss9
VlRa/yRHyuTkDYpQWT10WgZWc0DRhENyDcwezAr8/bz04uiOWeQ=
=JTFW
-----END PGP SIGNATURE-----

Thu Nov 7 10:08:44 UTC 2024 I: Checking whether the package is not for us
Thu Nov 7 10:08:44 UTC 2024 I: Starting 1st build on remote node virt32a-armhf-rb.debian.net.
Thu Nov 7 10:08:44 UTC 2024 I: Preparing to do remote build '1' on virt32a-armhf-rb.debian.net.
Thu Nov 7 11:00:28 UTC 2024 I: Deleting $TMPDIR on virt32a-armhf-rb.debian.net.
I: pbuilder: network access will be disabled during build
I: Current time: Wed Nov 6 22:08:51 -12 2024
I: pbuilder-time-stamp: 1730974131
I: Building the build Environment
I: extracting base tarball [/var/cache/pbuilder/trixie-reproducible-base.tgz]
I: copying local configuration
W: --override-config is not set; not updating apt.conf
Read the manpage for details.
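The Checksums-Sha256 stanza in the .dsc above is what dpkg-source validates when unpacking the source. A minimal standalone sketch of the same check (digests and filenames copied from the stanza; the download directory and function names are illustrative, not part of the build tooling):

```python
import hashlib
import os

# Digests copied from the Checksums-Sha256 field of mpi4py_4.0.0-8.dsc above.
EXPECTED = {
    "mpi4py_4.0.0.orig.tar.gz":
        "be7979a72f7cede836d72ca9423227bfbfaca65e5d9e7f3639bb52e11dda2cc6",
    "mpi4py_4.0.0-8.debian.tar.xz":
        "ed58ad4f6e763f747916a88b92e8512bf80a041afd3f87b38340a0f1573b44a1",
}

def sha256_of(path, chunk_size=1 << 16):
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(directory="."):
    """Compare each downloaded artifact against its published digest."""
    results = {}
    for name, want in EXPECTED.items():
        path = os.path.join(directory, name)
        if not os.path.exists(path):
            results[name] = "missing"
        else:
            results[name] = "OK" if sha256_of(path) == want else "MISMATCH"
    return results
```

`sha256sum -c` with a hand-built checksum file achieves the same result from the shell; dpkg-source performs this verification automatically during extraction.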
I: mounting /proc filesystem
I: mounting /sys filesystem
I: creating /{dev,run}/shm
I: mounting /dev/pts filesystem
I: redirecting /dev/ptmx to /dev/pts/ptmx
I: policy-rc.d already exists
I: Copying source file
I: copying [mpi4py_4.0.0-8.dsc]
I: copying [./mpi4py_4.0.0.orig.tar.gz]
I: copying [./mpi4py_4.0.0-8.debian.tar.xz]
I: Extracting source
gpgv: Signature made Sun Sep 29 08:34:00 2024
gpgv:                using RSA key 23C9A93E585819E9126D0A36573EF1E4BD5A01FA
gpgv: Can't check signature: No public key
dpkg-source: warning: cannot verify inline signature for ./mpi4py_4.0.0-8.dsc: no acceptable signature found
dpkg-source: info: extracting mpi4py in mpi4py-4.0.0
dpkg-source: info: unpacking mpi4py_4.0.0.orig.tar.gz
dpkg-source: info: unpacking mpi4py_4.0.0-8.debian.tar.xz
dpkg-source: info: using patch list from debian/patches/series
dpkg-source: info: applying intersphinx_use_local_inventory.patch
dpkg-source: info: applying skip_ppc_failing_tests.patch
dpkg-source: info: applying skip_s390x_failing_tests.patch
dpkg-source: info: applying skip_testPackUnpackExternal_sparc64.patch
dpkg-source: info: applying fix_symbols_PR549.patch
I: Not using root during the build.
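The gpgv warning above ("No public key") only means the signer's key is absent from the chroot's keyring; extraction proceeds regardless. A hedged sketch of how one could import the reported key and re-extract with the signature enforced (keyserver choice is an assumption; the network commands are shown commented out, since pbuilder disables network access during the build):

```shell
# RSA key id reported by gpgv in the log above.
KEYID=23C9A93E585819E9126D0A36573EF1E4BD5A01FA

# Hypothetical follow-up, outside the build chroot:
#   gpg --keyserver hkps://keyserver.ubuntu.com --recv-keys "$KEYID"
#   dpkg-source --require-valid-signature -x mpi4py_4.0.0-8.dsc
echo "key to import: $KEYID"
```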
I: Installing the build-deps
I: user script /srv/workspace/pbuilder/20615/tmp/hooks/D02_print_environment starting
I: set
  BUILDDIR='/build/reproducible-path'
  BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other'
  BUILDUSERNAME='pbuilder1'
  BUILD_ARCH='armhf'
  DEBIAN_FRONTEND='noninteractive'
  DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=3 '
  DISTRIBUTION='trixie'
  HOME='/root'
  HOST_ARCH='armhf'
  IFS=' '
  INVOCATION_ID='c5b4c6e9f9614b368303e0841871615d'
  LANG='C'
  LANGUAGE='en_US:en'
  LC_ALL='C'
  MAIL='/var/mail/root'
  OPTIND='1'
  PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games'
  PBCURRENTCOMMANDLINEOPERATION='build'
  PBUILDER_OPERATION='build'
  PBUILDER_PKGDATADIR='/usr/share/pbuilder'
  PBUILDER_PKGLIBDIR='/usr/lib/pbuilder'
  PBUILDER_SYSCONFDIR='/etc'
  PPID='20615'
  PS1='# '
  PS2='> '
  PS4='+ '
  PWD='/'
  SHELL='/bin/bash'
  SHLVL='2'
  SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc'
  SUDO_GID='113'
  SUDO_UID='107'
  SUDO_USER='jenkins'
  TERM='unknown'
  TZ='/usr/share/zoneinfo/Etc/GMT+12'
  USER='root'
  _='/usr/bin/systemd-run'
  http_proxy='http://10.0.0.15:3142/'
I: uname -a
Linux virt32a 6.1.0-26-armmp-lpae #1 SMP Debian 6.1.112-1 (2024-09-30) armv7l GNU/Linux
I: ls -l /bin
lrwxrwxrwx 1 root root 7 Aug 4 21:30 /bin -> usr/bin
I: user script /srv/workspace/pbuilder/20615/tmp/hooks/D02_print_environment finished
 -> Attempting to satisfy build-dependencies
 -> Creating pbuilder-satisfydepends-dummy package
Package: pbuilder-satisfydepends-dummy
Version: 0.invalid.0
Architecture: armhf
Maintainer: Debian Pbuilder Team
Description: Dummy package to satisfy dependencies with aptitude - created by pbuilder
 This package was created automatically by pbuilder to satisfy the
 build-dependencies of the package being currently built.
Depends: debhelper-compat (= 13), dh-sequence-python3, mpi-default-dev, mpi-default-bin, openssh-client, cython3, python3-all-dev, python3-numpy, python3-sphinx, python3-setuptools, texinfo, texlive, latexmk, tex-gyre, texlive-latex-extra, python3-doc, python-numpy-doc
dpkg-deb: building package 'pbuilder-satisfydepends-dummy' in '/tmp/satisfydepends-aptitude/pbuilder-satisfydepends-dummy.deb'.
Selecting previously unselected package pbuilder-satisfydepends-dummy.
(Reading database ... 19690 files and directories currently installed.)
Preparing to unpack .../pbuilder-satisfydepends-dummy.deb ...
Unpacking pbuilder-satisfydepends-dummy (0.invalid.0) ...
dpkg: pbuilder-satisfydepends-dummy: dependency problems, but configuring anyway as you requested:
 pbuilder-satisfydepends-dummy depends on debhelper-compat (= 13); however:
  Package debhelper-compat is not installed.
 pbuilder-satisfydepends-dummy depends on dh-sequence-python3; however:
  Package dh-sequence-python3 is not installed.
 pbuilder-satisfydepends-dummy depends on mpi-default-dev; however:
  Package mpi-default-dev is not installed.
 pbuilder-satisfydepends-dummy depends on mpi-default-bin; however:
  Package mpi-default-bin is not installed.
 pbuilder-satisfydepends-dummy depends on openssh-client; however:
  Package openssh-client is not installed.
 pbuilder-satisfydepends-dummy depends on cython3; however:
  Package cython3 is not installed.
 pbuilder-satisfydepends-dummy depends on python3-all-dev; however:
  Package python3-all-dev is not installed.
 pbuilder-satisfydepends-dummy depends on python3-numpy; however:
  Package python3-numpy is not installed.
 pbuilder-satisfydepends-dummy depends on python3-sphinx; however:
  Package python3-sphinx is not installed.
 pbuilder-satisfydepends-dummy depends on python3-setuptools; however:
  Package python3-setuptools is not installed.
 pbuilder-satisfydepends-dummy depends on texinfo; however:
  Package texinfo is not installed.
 pbuilder-satisfydepends-dummy depends on texlive; however:
  Package texlive is not installed.
 pbuilder-satisfydepends-dummy depends on latexmk; however:
  Package latexmk is not installed.
 pbuilder-satisfydepends-dummy depends on tex-gyre; however:
  Package tex-gyre is not installed.
 pbuilder-satisfydepends-dummy depends on texlive-latex-extra; however:
  Package texlive-latex-extra is not installed.
 pbuilder-satisfydepends-dummy depends on python3-doc; however:
  Package python3-doc is not installed.
 pbuilder-satisfydepends-dummy depends on python-numpy-doc; however:
  Package python-numpy-doc is not installed.
Setting up pbuilder-satisfydepends-dummy (0.invalid.0) ...
Reading package lists...
Building dependency tree...
Reading state information...
Initializing package states...
Writing extended state information...
Building tag database...
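The dummy-package trick above hands dependency resolution to apt/aptitude instead of installing each build dependency manually. Outside pbuilder, devscripts' mk-build-deps does the same thing; a hedged sketch (assumes devscripts is installed and the .dsc is in the current directory; the command is shown commented out rather than run):

```shell
DSC=mpi4py_4.0.0-8.dsc

# Hypothetical equivalent of pbuilder-satisfydepends-dummy:
#   mk-build-deps --install --root-cmd sudo "$DSC"
echo "would satisfy build-deps of: $DSC"
```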
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0) pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0) The following NEW packages will be installed: autoconf{a} automake{a} autopoint{a} autotools-dev{a} bsdextrautils{a} ca-certificates{a} cython3{a} debhelper{a} dh-autoreconf{a} dh-python{a} dh-strip-nondeterminism{a} docutils-common{a} dwz{a} file{a} fontconfig-config{a} fonts-dejavu-core{a} fonts-dejavu-mono{a} fonts-lmodern{a} gettext{a} gettext-base{a} gfortran-14{a} gfortran-14-arm-linux-gnueabihf{a} groff-base{a} hwloc-nox{a} intltool-debian{a} latexmk{a} libapache-pom-java{a} libarchive-zip-perl{a} libblas3{a} libbrotli1{a} libcairo2{a} libcbor0.10{a} libcom-err2{a} libcommons-logging-java{a} libcommons-parent-java{a} libdebhelper-perl{a} libedit2{a} libelf1t64{a} libexpat1{a} libexpat1-dev{a} libfido2-1{a} libfile-stripnondeterminism-perl{a} libfontbox-java{a} libfontconfig1{a} libfontenc1{a} libfreetype6{a} libgfortran-14-dev{a} libgfortran5{a} libglib2.0-0t64{a} libgraphite2-3{a} libgssapi-krb5-2{a} libharfbuzz0b{a} libhwloc-dev{a} libhwloc15{a} libice6{a} libicu72{a} libjs-jquery{a} libjs-sphinxdoc{a} libjs-underscore{a} libjson-perl{a} libk5crypto3{a} libkeyutils1{a} libkpathsea6{a} libkrb5-3{a} libkrb5support0{a} liblapack3{a} libltdl-dev{a} libltdl7{a} libmagic-mgc{a} libmagic1t64{a} libmpfi0{a} libmpich-dev{a} libmpich12{a} libnsl2{a} libnuma-dev{a} libnuma1{a} libpaper-utils{a} libpaper1{a} libpdfbox-java{a} libpipeline1{a} libpixman-1-0{a} libpng16-16t64{a} libpotrace0{a} libptexenc1{a} libpython3-all-dev{a} libpython3-dev{a} libpython3-stdlib{a} libpython3.12-dev{a} libpython3.12-minimal{a} libpython3.12-stdlib{a} libpython3.12t64{a} libreadline8t64{a} libslurm41t64{a} libsm6{a} libsynctex2{a} libteckit0{a} libtexlua53-5{a} libtext-unidecode-perl{a} libtirpc-common{a} libtirpc3t64{a} libtool{a} libuchardet0{a} libx11-6{a} libx11-data{a} libxau6{a} libxaw7{a} 
libxcb-render0{a} libxcb-shm0{a} libxcb1{a} libxdmcp6{a} libxext6{a} libxi6{a} libxml-libxml-perl{a} libxml-namespacesupport-perl{a} libxml-sax-base-perl{a} libxml-sax-perl{a} libxml2{a} libxmu6{a} libxpm4{a} libxrender1{a} libxt6t64{a} libzzip-0-13t64{a} m4{a} man-db{a} media-types{a} mpi-default-bin{a} mpi-default-dev{a} mpich{a} netbase{a} openssh-client{a} openssl{a} po-debconf{a} preview-latex-style{a} python-babel-localedata{a} python-numpy-doc{a} python3{a} python3-alabaster{a} python3-all{a} python3-all-dev{a} python3-autocommand{a} python3-babel{a} python3-certifi{a} python3-chardet{a} python3-charset-normalizer{a} python3-defusedxml{a} python3-dev{a} python3-doc{a} python3-docutils{a} python3-idna{a} python3-imagesize{a} python3-inflect{a} python3-jaraco.context{a} python3-jaraco.functools{a} python3-jinja2{a} python3-markupsafe{a} python3-minimal{a} python3-more-itertools{a} python3-numpy{a} python3-packaging{a} python3-pkg-resources{a} python3-pygments{a} python3-requests{a} python3-roman{a} python3-setuptools{a} python3-snowballstemmer{a} python3-sphinx{a} python3-typeguard{a} python3-typing-extensions{a} python3-urllib3{a} python3-zipp{a} python3.12{a} python3.12-dev{a} python3.12-doc{a} python3.12-minimal{a} readline-common{a} sensible-utils{a} sgml-base{a} sphinx-common{a} t1utils{a} tex-common{a} tex-gyre{a} texinfo{a} texinfo-lib{a} texlive{a} texlive-base{a} texlive-binaries{a} texlive-fonts-recommended{a} texlive-latex-base{a} texlive-latex-extra{a} texlive-latex-recommended{a} texlive-pictures{a} tzdata{a} ucf{a} x11-common{a} xdg-utils{a} xfonts-encodings{a} xfonts-utils{a} xml-core{a} zlib1g-dev{a} The following packages are RECOMMENDED but will NOT be installed: apvlv curl default-jre dvisvgm evince fonts-texgyre fonts-texgyre-math ghostscript gv javascript-common krb5-locales libarchive-cpio-perl libfile-mimeinfo-perl libglib2.0-data libhwloc-plugins libjson-xs-perl libmail-sendmail-perl libnet-dbus-perl libspreadsheet-parseexcel-perl 
libwww-perl libx11-protocol-perl libxml-sax-expat-perl lmodern lynx mupdf okular python3-pil qpdfview qpdfview-ps-plugin ruby shared-mime-info texlive-plain-generic tipa tk viewpdf.app wget x11-utils x11-xserver-utils xauth xdg-user-dirs xpdf zathura-pdf-poppler zathura-ps 0 packages upgraded, 199 newly installed, 0 to remove and 0 not upgraded. Need to get 203 MB of archives. After unpacking 806 MB will be used. Writing extended state information... Get: 1 http://deb.debian.org/debian trixie/main armhf libpython3.12-minimal armhf 3.12.6-1 [800 kB] Get: 2 http://deb.debian.org/debian trixie/main armhf libexpat1 armhf 2.6.3-2 [83.1 kB] Get: 3 http://deb.debian.org/debian trixie/main armhf python3.12-minimal armhf 3.12.6-1 [1812 kB] Get: 4 http://deb.debian.org/debian trixie/main armhf python3-minimal armhf 3.12.6-1 [26.7 kB] Get: 5 http://deb.debian.org/debian trixie/main armhf media-types all 10.1.0 [26.9 kB] Get: 6 http://deb.debian.org/debian trixie/main armhf netbase all 6.4 [12.8 kB] Get: 7 http://deb.debian.org/debian trixie/main armhf tzdata all 2024a-4 [255 kB] Get: 8 http://deb.debian.org/debian trixie/main armhf libkrb5support0 armhf 1.21.3-3 [30.0 kB] Get: 9 http://deb.debian.org/debian trixie/main armhf libcom-err2 armhf 1.47.1-1+b1 [22.3 kB] Get: 10 http://deb.debian.org/debian trixie/main armhf libk5crypto3 armhf 1.21.3-3 [75.8 kB] Get: 11 http://deb.debian.org/debian trixie/main armhf libkeyutils1 armhf 1.6.3-4 [8096 B] Get: 12 http://deb.debian.org/debian trixie/main armhf libkrb5-3 armhf 1.21.3-3 [283 kB] Get: 13 http://deb.debian.org/debian trixie/main armhf libgssapi-krb5-2 armhf 1.21.3-3 [114 kB] Get: 14 http://deb.debian.org/debian trixie/main armhf libtirpc-common all 1.3.4+ds-1.3 [10.9 kB] Get: 15 http://deb.debian.org/debian trixie/main armhf libtirpc3t64 armhf 1.3.4+ds-1.3+b1 [71.3 kB] Get: 16 http://deb.debian.org/debian trixie/main armhf libnsl2 armhf 1.3.0-3+b3 [35.0 kB] Get: 17 http://deb.debian.org/debian trixie/main armhf 
readline-common all 8.2-5 [69.3 kB] Get: 18 http://deb.debian.org/debian trixie/main armhf libreadline8t64 armhf 8.2-5 [146 kB] Get: 19 http://deb.debian.org/debian trixie/main armhf libpython3.12-stdlib armhf 3.12.6-1 [1817 kB] Get: 20 http://deb.debian.org/debian trixie/main armhf python3.12 armhf 3.12.6-1 [669 kB] Get: 21 http://deb.debian.org/debian trixie/main armhf libpython3-stdlib armhf 3.12.6-1 [9692 B] Get: 22 http://deb.debian.org/debian trixie/main armhf python3 armhf 3.12.6-1 [27.8 kB] Get: 23 http://deb.debian.org/debian trixie/main armhf sgml-base all 1.31 [15.4 kB] Get: 24 http://deb.debian.org/debian trixie/main armhf sensible-utils all 0.0.24 [24.8 kB] Get: 25 http://deb.debian.org/debian trixie/main armhf openssl armhf 3.3.2-2 [1348 kB] Get: 26 http://deb.debian.org/debian trixie/main armhf ca-certificates all 20240203 [158 kB] Get: 27 http://deb.debian.org/debian trixie/main armhf libmagic-mgc armhf 1:5.45-3+b1 [314 kB] Get: 28 http://deb.debian.org/debian trixie/main armhf libmagic1t64 armhf 1:5.45-3+b1 [98.5 kB] Get: 29 http://deb.debian.org/debian trixie/main armhf file armhf 1:5.45-3+b1 [42.3 kB] Get: 30 http://deb.debian.org/debian trixie/main armhf gettext-base armhf 0.22.5-2 [195 kB] Get: 31 http://deb.debian.org/debian trixie/main armhf libuchardet0 armhf 0.0.8-1+b2 [65.6 kB] Get: 32 http://deb.debian.org/debian trixie/main armhf groff-base armhf 1.23.0-5 [1091 kB] Get: 33 http://deb.debian.org/debian trixie/main armhf bsdextrautils armhf 2.40.2-9 [88.8 kB] Get: 34 http://deb.debian.org/debian trixie/main armhf libpipeline1 armhf 1.5.8-1 [35.0 kB] Get: 35 http://deb.debian.org/debian trixie/main armhf man-db armhf 2.13.0-1 [1382 kB] Get: 36 http://deb.debian.org/debian trixie/main armhf libedit2 armhf 3.1-20240808-1 [77.9 kB] Get: 37 http://deb.debian.org/debian trixie/main armhf libcbor0.10 armhf 0.10.2-2 [24.3 kB] Get: 38 http://deb.debian.org/debian trixie/main armhf libfido2-1 armhf 1.15.0-1+b1 [71.2 kB] Get: 39 
http://deb.debian.org/debian trixie/main armhf openssh-client armhf 1:9.9p1-3 [902 kB] Get: 40 http://deb.debian.org/debian trixie/main armhf ucf all 3.0043+nmu1 [55.2 kB] Get: 41 http://deb.debian.org/debian trixie/main armhf m4 armhf 1.4.19-4 [264 kB] Get: 42 http://deb.debian.org/debian trixie/main armhf autoconf all 2.72-3 [493 kB] Get: 43 http://deb.debian.org/debian trixie/main armhf autotools-dev all 20220109.1 [51.6 kB] Get: 44 http://deb.debian.org/debian trixie/main armhf automake all 1:1.16.5-1.3 [823 kB] Get: 45 http://deb.debian.org/debian trixie/main armhf autopoint all 0.22.5-2 [723 kB] Get: 46 http://deb.debian.org/debian trixie/main armhf cython3 armhf 3.0.11+dfsg-1 [2265 kB] Get: 47 http://deb.debian.org/debian trixie/main armhf libdebhelper-perl all 13.20 [89.7 kB] Get: 48 http://deb.debian.org/debian trixie/main armhf libtool all 2.4.7-7 [517 kB] Get: 49 http://deb.debian.org/debian trixie/main armhf dh-autoreconf all 20 [17.1 kB] Get: 50 http://deb.debian.org/debian trixie/main armhf libarchive-zip-perl all 1.68-1 [104 kB] Get: 51 http://deb.debian.org/debian trixie/main armhf libfile-stripnondeterminism-perl all 1.14.0-1 [19.5 kB] Get: 52 http://deb.debian.org/debian trixie/main armhf dh-strip-nondeterminism all 1.14.0-1 [8448 B] Get: 53 http://deb.debian.org/debian trixie/main armhf libelf1t64 armhf 0.192-4 [184 kB] Get: 54 http://deb.debian.org/debian trixie/main armhf dwz armhf 0.15-1+b2 [106 kB] Get: 55 http://deb.debian.org/debian trixie/main armhf libicu72 armhf 72.1-5+b1 [9088 kB] Get: 56 http://deb.debian.org/debian trixie/main armhf libxml2 armhf 2.12.7+dfsg+really2.9.14-0.1 [604 kB] Get: 57 http://deb.debian.org/debian trixie/main armhf gettext armhf 0.22.5-2 [1485 kB] Get: 58 http://deb.debian.org/debian trixie/main armhf intltool-debian all 0.35.0+20060710.6 [22.9 kB] Get: 59 http://deb.debian.org/debian trixie/main armhf po-debconf all 1.0.21+nmu1 [248 kB] Get: 60 http://deb.debian.org/debian trixie/main armhf debhelper all 13.20 
[915 kB] Get: 61 http://deb.debian.org/debian trixie/main armhf python3-autocommand all 2.2.2-3 [13.6 kB] Get: 62 http://deb.debian.org/debian trixie/main armhf python3-more-itertools all 10.5.0-1 [63.8 kB] Get: 63 http://deb.debian.org/debian trixie/main armhf python3-typing-extensions all 4.12.2-2 [73.0 kB] Get: 64 http://deb.debian.org/debian trixie/main armhf python3-typeguard all 4.4.1-1 [37.0 kB] Get: 65 http://deb.debian.org/debian trixie/main armhf python3-inflect all 7.3.1-2 [32.4 kB] Get: 66 http://deb.debian.org/debian trixie/main armhf python3-jaraco.context all 6.0.0-1 [7984 B] Get: 67 http://deb.debian.org/debian trixie/main armhf python3-jaraco.functools all 4.1.0-1 [12.0 kB] Get: 68 http://deb.debian.org/debian trixie/main armhf python3-pkg-resources all 74.1.2-2 [213 kB] Get: 69 http://deb.debian.org/debian trixie/main armhf python3-zipp all 3.20.2-1 [10.3 kB] Get: 70 http://deb.debian.org/debian trixie/main armhf python3-setuptools all 74.1.2-2 [736 kB] Get: 71 http://deb.debian.org/debian trixie/main armhf dh-python all 6.20241024 [109 kB] Get: 72 http://deb.debian.org/debian trixie/main armhf xml-core all 0.19 [20.1 kB] Get: 73 http://deb.debian.org/debian trixie/main armhf docutils-common all 0.21.2+dfsg-2 [128 kB] Get: 74 http://deb.debian.org/debian trixie/main armhf fonts-dejavu-mono all 2.37-8 [489 kB] Get: 75 http://deb.debian.org/debian trixie/main armhf fonts-dejavu-core all 2.37-8 [840 kB] Get: 76 http://deb.debian.org/debian trixie/main armhf fontconfig-config armhf 2.15.0-1.1+b1 [318 kB] Get: 77 http://deb.debian.org/debian trixie/main armhf fonts-lmodern all 2.005-1 [4540 kB] Get: 78 http://deb.debian.org/debian trixie/main armhf libgfortran5 armhf 14.2.0-6 [263 kB] Get: 79 http://deb.debian.org/debian trixie/main armhf libgfortran-14-dev armhf 14.2.0-6 [315 kB] Get: 80 http://deb.debian.org/debian trixie/main armhf gfortran-14-arm-linux-gnueabihf armhf 14.2.0-6 [8718 kB] Get: 81 http://deb.debian.org/debian trixie/main armhf 
gfortran-14 armhf 14.2.0-6 [11.9 kB] Get: 82 http://deb.debian.org/debian trixie/main armhf libhwloc15 armhf 2.11.2-1 [134 kB] Get: 83 http://deb.debian.org/debian trixie/main armhf hwloc-nox armhf 2.11.2-1 [201 kB] Get: 84 http://deb.debian.org/debian trixie/main armhf tex-common all 6.18 [32.5 kB] Get: 85 http://deb.debian.org/debian trixie/main armhf libpaper1 armhf 1.1.29+b2 [12.2 kB] Get: 86 http://deb.debian.org/debian trixie/main armhf libpaper-utils armhf 1.1.29+b2 [8700 B] Get: 87 http://deb.debian.org/debian trixie/main armhf libkpathsea6 armhf 2024.20240313.70630+ds-4 [147 kB] Get: 88 http://deb.debian.org/debian trixie/main armhf libptexenc1 armhf 2024.20240313.70630+ds-4 [44.3 kB] Get: 89 http://deb.debian.org/debian trixie/main armhf libsynctex2 armhf 2024.20240313.70630+ds-4 [49.2 kB] Get: 90 http://deb.debian.org/debian trixie/main armhf libtexlua53-5 armhf 2024.20240313.70630+ds-4 [82.7 kB] Get: 91 http://deb.debian.org/debian trixie/main armhf t1utils armhf 1.41-4 [54.7 kB] Get: 92 http://deb.debian.org/debian trixie/main armhf libbrotli1 armhf 1.1.0-2+b5 [296 kB] Get: 93 http://deb.debian.org/debian trixie/main armhf libpng16-16t64 armhf 1.6.44-2 [263 kB] Get: 94 http://deb.debian.org/debian trixie/main armhf libfreetype6 armhf 2.13.3+dfsg-1 [385 kB] Get: 95 http://deb.debian.org/debian trixie/main armhf libfontconfig1 armhf 2.15.0-1.1+b1 [371 kB] Get: 96 http://deb.debian.org/debian trixie/main armhf libpixman-1-0 armhf 0.42.2-1+b1 [476 kB] Get: 97 http://deb.debian.org/debian trixie/main armhf libxau6 armhf 1:1.0.11-1 [19.7 kB] Get: 98 http://deb.debian.org/debian trixie/main armhf libxdmcp6 armhf 1:1.1.2-3+b2 [23.0 kB] Get: 99 http://deb.debian.org/debian trixie/main armhf libxcb1 armhf 1.17.0-2+b1 [140 kB] Get: 100 http://deb.debian.org/debian trixie/main armhf libx11-data all 2:1.8.7-1 [328 kB] Get: 101 http://deb.debian.org/debian trixie/main armhf libx11-6 armhf 2:1.8.7-1+b2 [741 kB] Get: 102 http://deb.debian.org/debian trixie/main armhf 
libxcb-render0 armhf 1.17.0-2+b1 [114 kB] Get: 103 http://deb.debian.org/debian trixie/main armhf libxcb-shm0 armhf 1.17.0-2+b1 [105 kB] Get: 104 http://deb.debian.org/debian trixie/main armhf libxext6 armhf 2:1.3.4-1+b2 [45.2 kB] Get: 105 http://deb.debian.org/debian trixie/main armhf libxrender1 armhf 1:0.9.10-1.1+b2 [25.0 kB] Get: 106 http://deb.debian.org/debian trixie/main armhf libcairo2 armhf 1.18.2-2 [443 kB] Get: 107 http://deb.debian.org/debian trixie/main armhf libgraphite2-3 armhf 1.3.14-2+b1 [63.1 kB] Get: 108 http://deb.debian.org/debian trixie/main armhf libglib2.0-0t64 armhf 2.82.2-2 [1326 kB] Get: 109 http://deb.debian.org/debian trixie/main armhf libharfbuzz0b armhf 10.0.1-1 [418 kB] Get: 110 http://deb.debian.org/debian trixie/main armhf libmpfi0 armhf 1.5.4+ds-3 [28.6 kB] Get: 111 http://deb.debian.org/debian trixie/main armhf libpotrace0 armhf 1.16-2+b2 [22.7 kB] Get: 112 http://deb.debian.org/debian trixie/main armhf libteckit0 armhf 2.5.12+ds1-1+b1 [259 kB] Get: 113 http://deb.debian.org/debian trixie/main armhf x11-common all 1:7.7+23.1 [216 kB] Get: 114 http://deb.debian.org/debian trixie/main armhf libice6 armhf 2:1.1.1-1 [58.5 kB] Get: 115 http://deb.debian.org/debian trixie/main armhf libsm6 armhf 2:1.2.4-1 [33.5 kB] Get: 116 http://deb.debian.org/debian trixie/main armhf libxt6t64 armhf 1:1.2.1-1.2+b1 [160 kB] Get: 117 http://deb.debian.org/debian trixie/main armhf libxmu6 armhf 2:1.1.3-3+b3 [51.2 kB] Get: 118 http://deb.debian.org/debian trixie/main armhf libxpm4 armhf 1:3.5.17-1+b2 [50.4 kB] Get: 119 http://deb.debian.org/debian trixie/main armhf libxaw7 armhf 2:1.0.14-1+b3 [166 kB] Get: 120 http://deb.debian.org/debian trixie/main armhf libxi6 armhf 2:1.8.2-1 [73.6 kB] Get: 121 http://deb.debian.org/debian trixie/main armhf libzzip-0-13t64 armhf 0.13.72+dfsg.1-1.2+b1 [53.0 kB] Get: 122 http://deb.debian.org/debian trixie/main armhf texlive-binaries armhf 2024.20240313.70630+ds-4 [6043 kB] Get: 123 http://deb.debian.org/debian 
trixie/main armhf xdg-utils all 1.1.3-4.1 [75.5 kB] Get: 124 http://deb.debian.org/debian trixie/main armhf texlive-base all 2024.20240829-2 [22.7 MB] Get: 125 http://deb.debian.org/debian trixie/main armhf texlive-latex-base all 2024.20240829-2 [1273 kB] Get: 126 http://deb.debian.org/debian trixie/main armhf latexmk all 1:4.85-1 [501 kB] Get: 127 http://deb.debian.org/debian trixie/main armhf libapache-pom-java all 33-2 [5852 B] Get: 128 http://deb.debian.org/debian trixie/main armhf libblas3 armhf 3.12.0-3+b1 [111 kB] Get: 129 http://deb.debian.org/debian trixie/main armhf libcommons-parent-java all 56-1 [10.8 kB] Get: 130 http://deb.debian.org/debian trixie/main armhf libcommons-logging-java all 1.3.0-1 [68.6 kB] Get: 131 http://deb.debian.org/debian trixie/main armhf libexpat1-dev armhf 2.6.3-2 [140 kB] Get: 132 http://deb.debian.org/debian trixie/main armhf libfontbox-java all 1:1.8.16-5 [211 kB] Get: 133 http://deb.debian.org/debian trixie/main armhf libfontenc1 armhf 1:1.1.8-1+b1 [20.8 kB] Get: 134 http://deb.debian.org/debian trixie/main armhf libnuma1 armhf 2.0.18-1+b1 [18.9 kB] Get: 135 http://deb.debian.org/debian trixie/main armhf libnuma-dev armhf 2.0.18-1+b1 [34.6 kB] Get: 136 http://deb.debian.org/debian trixie/main armhf libltdl7 armhf 2.4.7-7+b2 [390 kB] Get: 137 http://deb.debian.org/debian trixie/main armhf libltdl-dev armhf 2.4.7-7+b2 [162 kB] Get: 138 http://deb.debian.org/debian trixie/main armhf libhwloc-dev armhf 2.11.2-1 [226 kB] Get: 139 http://deb.debian.org/debian trixie/main armhf libjs-jquery all 3.6.1+dfsg+~3.5.14-1 [326 kB] Get: 140 http://deb.debian.org/debian trixie/main armhf libjs-underscore all 1.13.4~dfsg+~1.11.4-3 [116 kB] Get: 141 http://deb.debian.org/debian trixie/main armhf libjs-sphinxdoc all 7.4.7-4 [158 kB] Get: 142 http://deb.debian.org/debian trixie/main armhf libjson-perl all 4.10000-1 [87.5 kB] Get: 143 http://deb.debian.org/debian trixie/main armhf liblapack3 armhf 3.12.0-3+b1 [1828 kB] Get: 144 
http://deb.debian.org/debian trixie/main armhf libmpich12 armhf 4.2.0-14 [1498 kB] Get: 145 http://deb.debian.org/debian trixie/main armhf libslurm41t64 armhf 24.05.2-1 [682 kB] Get: 146 http://deb.debian.org/debian trixie/main armhf mpich armhf 4.2.0-14 [223 kB] Get: 147 http://deb.debian.org/debian trixie/main armhf libmpich-dev armhf 4.2.0-14 [2410 kB] Get: 148 http://deb.debian.org/debian trixie/main armhf libpdfbox-java all 1:1.8.16-5 [5527 kB] Get: 149 http://deb.debian.org/debian trixie/main armhf libpython3.12t64 armhf 3.12.6-1 [1847 kB] Get: 150 http://deb.debian.org/debian trixie/main armhf zlib1g-dev armhf 1:1.3.dfsg+really1.3.1-1+b1 [905 kB] Get: 151 http://deb.debian.org/debian trixie/main armhf libpython3.12-dev armhf 3.12.6-1 [3809 kB] Get: 152 http://deb.debian.org/debian trixie/main armhf libpython3-dev armhf 3.12.6-1 [9952 B] Get: 153 http://deb.debian.org/debian trixie/main armhf libpython3-all-dev armhf 3.12.6-1 [1064 B] Get: 154 http://deb.debian.org/debian trixie/main armhf libtext-unidecode-perl all 1.30-3 [101 kB] Get: 155 http://deb.debian.org/debian trixie/main armhf libxml-namespacesupport-perl all 1.12-2 [15.1 kB] Get: 156 http://deb.debian.org/debian trixie/main armhf libxml-sax-base-perl all 1.09-3 [20.6 kB] Get: 157 http://deb.debian.org/debian trixie/main armhf libxml-sax-perl all 1.02+dfsg-3 [59.4 kB] Get: 158 http://deb.debian.org/debian trixie/main armhf libxml-libxml-perl armhf 2.0207+dfsg+really+2.0134-5+b1 [298 kB] Get: 159 http://deb.debian.org/debian trixie/main armhf mpi-default-bin armhf 1.17 [2368 B] Get: 160 http://deb.debian.org/debian trixie/main armhf mpi-default-dev armhf 1.17 [3148 B] Get: 161 http://deb.debian.org/debian trixie/main armhf preview-latex-style all 13.2-1 [350 kB] Get: 162 http://deb.debian.org/debian trixie/main armhf python-babel-localedata all 2.14.0-1 [5701 kB] Get: 163 http://deb.debian.org/debian trixie/main armhf python-numpy-doc all 1:1.26.4+ds-11 [9384 kB] Get: 164 http://deb.debian.org/debian 
trixie/main armhf python3-alabaster all 0.7.16-0.1 [27.9 kB] Get: 165 http://deb.debian.org/debian trixie/main armhf python3-all armhf 3.12.6-1 [1040 B] Get: 166 http://deb.debian.org/debian trixie/main armhf python3.12-dev armhf 3.12.6-1 [506 kB] Get: 167 http://deb.debian.org/debian trixie/main armhf python3-dev armhf 3.12.6-1 [26.1 kB] Get: 168 http://deb.debian.org/debian trixie/main armhf python3-all-dev armhf 3.12.6-1 [1068 B] Get: 169 http://deb.debian.org/debian trixie/main armhf python3-babel all 2.14.0-1 [111 kB] Get: 170 http://deb.debian.org/debian trixie/main armhf python3-certifi all 2024.8.30+dfsg-1 [9576 B] Get: 171 http://deb.debian.org/debian trixie/main armhf python3-chardet all 5.2.0+dfsg-1 [107 kB] Get: 172 http://deb.debian.org/debian trixie/main armhf python3-charset-normalizer armhf 3.4.0-1 [111 kB] Get: 173 http://deb.debian.org/debian trixie/main armhf python3-defusedxml all 0.7.1-2 [43.3 kB] Get: 174 http://deb.debian.org/debian trixie/main armhf python3.12-doc all 3.12.6-1 [13.1 MB] Get: 175 http://deb.debian.org/debian trixie/main armhf python3-doc all 3.12.6-1 [9884 B] Get: 176 http://deb.debian.org/debian trixie/main armhf python3-roman all 4.2-1 [10.4 kB] Get: 177 http://deb.debian.org/debian trixie/main armhf python3-docutils all 0.21.2+dfsg-2 [403 kB] Get: 178 http://deb.debian.org/debian trixie/main armhf python3-idna all 3.8-2 [41.6 kB] Get: 179 http://deb.debian.org/debian trixie/main armhf python3-imagesize all 1.4.1-1 [6688 B] Get: 180 http://deb.debian.org/debian trixie/main armhf python3-markupsafe armhf 2.1.5-1+b1 [13.2 kB] Get: 181 http://deb.debian.org/debian trixie/main armhf python3-jinja2 all 3.1.3-1 [119 kB] Get: 182 http://deb.debian.org/debian trixie/main armhf python3-numpy armhf 1:1.26.4+ds-11 [3340 kB] Get: 183 http://deb.debian.org/debian trixie/main armhf python3-packaging all 24.1-1 [45.8 kB] Get: 184 http://deb.debian.org/debian trixie/main armhf python3-pygments all 2.18.0+dfsg-1 [836 kB] Get: 185 
http://deb.debian.org/debian trixie/main armhf python3-urllib3 all 2.0.7-2 [111 kB] Get: 186 http://deb.debian.org/debian trixie/main armhf python3-requests all 2.32.3+dfsg-1 [71.9 kB] Get: 187 http://deb.debian.org/debian trixie/main armhf python3-snowballstemmer all 2.2.0-4 [58.0 kB] Get: 188 http://deb.debian.org/debian trixie/main armhf sphinx-common all 7.4.7-4 [731 kB] Get: 189 http://deb.debian.org/debian trixie/main armhf python3-sphinx all 7.4.7-4 [588 kB] Get: 190 http://deb.debian.org/debian trixie/main armhf xfonts-encodings all 1:1.0.4-2.2 [577 kB] Get: 191 http://deb.debian.org/debian trixie/main armhf xfonts-utils armhf 1:7.7+7 [84.2 kB] Get: 192 http://deb.debian.org/debian trixie/main armhf tex-gyre all 20180621-6 [6209 kB] Get: 193 http://deb.debian.org/debian trixie/main armhf texinfo-lib armhf 7.1.1-1+b1 [209 kB] Get: 194 http://deb.debian.org/debian trixie/main armhf texinfo all 7.1.1-1 [1753 kB] Get: 195 http://deb.debian.org/debian trixie/main armhf texlive-fonts-recommended all 2024.20240829-2 [4990 kB] Get: 196 http://deb.debian.org/debian trixie/main armhf texlive-latex-recommended all 2024.20240829-2 [8845 kB] Get: 197 http://deb.debian.org/debian trixie/main armhf texlive all 2024.20240829-2 [18.6 kB] Get: 198 http://deb.debian.org/debian trixie/main armhf texlive-pictures all 2024.20240829-2 [17.0 MB] Get: 199 http://deb.debian.org/debian trixie/main armhf texlive-latex-extra all 2024.20240829-1 [20.9 MB] Fetched 203 MB in 5s (41.3 MB/s) debconf: delaying package configuration, since apt-utils is not installed Selecting previously unselected package libpython3.12-minimal:armhf. (Reading database ... 19690 files and directories currently installed.) Preparing to unpack .../libpython3.12-minimal_3.12.6-1_armhf.deb ... Unpacking libpython3.12-minimal:armhf (3.12.6-1) ... Selecting previously unselected package libexpat1:armhf. Preparing to unpack .../libexpat1_2.6.3-2_armhf.deb ... Unpacking libexpat1:armhf (2.6.3-2) ... Selecting previously unselected package python3.12-minimal. Preparing to unpack .../python3.12-minimal_3.12.6-1_armhf.deb ... Unpacking python3.12-minimal (3.12.6-1) ... Setting up libpython3.12-minimal:armhf (3.12.6-1) ... Setting up libexpat1:armhf (2.6.3-2) ... Setting up python3.12-minimal (3.12.6-1) ... Selecting previously unselected package python3-minimal. (Reading database ... 20010 files and directories currently installed.) Preparing to unpack .../00-python3-minimal_3.12.6-1_armhf.deb ... Unpacking python3-minimal (3.12.6-1) ... Selecting previously unselected package media-types. Preparing to unpack .../01-media-types_10.1.0_all.deb ... Unpacking media-types (10.1.0) ... Selecting previously unselected package netbase. Preparing to unpack .../02-netbase_6.4_all.deb ... Unpacking netbase (6.4) ... Selecting previously unselected package tzdata. Preparing to unpack .../03-tzdata_2024a-4_all.deb ...
Unpacking tzdata (2024a-4) ... Selecting previously unselected package libkrb5support0:armhf. Preparing to unpack .../04-libkrb5support0_1.21.3-3_armhf.deb ... Unpacking libkrb5support0:armhf (1.21.3-3) ... Selecting previously unselected package libcom-err2:armhf. Preparing to unpack .../05-libcom-err2_1.47.1-1+b1_armhf.deb ... Unpacking libcom-err2:armhf (1.47.1-1+b1) ... Selecting previously unselected package libk5crypto3:armhf. Preparing to unpack .../06-libk5crypto3_1.21.3-3_armhf.deb ... Unpacking libk5crypto3:armhf (1.21.3-3) ... Selecting previously unselected package libkeyutils1:armhf. Preparing to unpack .../07-libkeyutils1_1.6.3-4_armhf.deb ... Unpacking libkeyutils1:armhf (1.6.3-4) ... Selecting previously unselected package libkrb5-3:armhf. Preparing to unpack .../08-libkrb5-3_1.21.3-3_armhf.deb ... Unpacking libkrb5-3:armhf (1.21.3-3) ... Selecting previously unselected package libgssapi-krb5-2:armhf. Preparing to unpack .../09-libgssapi-krb5-2_1.21.3-3_armhf.deb ... Unpacking libgssapi-krb5-2:armhf (1.21.3-3) ... Selecting previously unselected package libtirpc-common. Preparing to unpack .../10-libtirpc-common_1.3.4+ds-1.3_all.deb ... Unpacking libtirpc-common (1.3.4+ds-1.3) ... Selecting previously unselected package libtirpc3t64:armhf. Preparing to unpack .../11-libtirpc3t64_1.3.4+ds-1.3+b1_armhf.deb ... Adding 'diversion of /lib/arm-linux-gnueabihf/libtirpc.so.3 to /lib/arm-linux-gnueabihf/libtirpc.so.3.usr-is-merged by libtirpc3t64' Adding 'diversion of /lib/arm-linux-gnueabihf/libtirpc.so.3.0.0 to /lib/arm-linux-gnueabihf/libtirpc.so.3.0.0.usr-is-merged by libtirpc3t64' Unpacking libtirpc3t64:armhf (1.3.4+ds-1.3+b1) ... Selecting previously unselected package libnsl2:armhf. Preparing to unpack .../12-libnsl2_1.3.0-3+b3_armhf.deb ... Unpacking libnsl2:armhf (1.3.0-3+b3) ... Selecting previously unselected package readline-common. Preparing to unpack .../13-readline-common_8.2-5_all.deb ... Unpacking readline-common (8.2-5) ... 
Selecting previously unselected package libreadline8t64:armhf. Preparing to unpack .../14-libreadline8t64_8.2-5_armhf.deb ... Adding 'diversion of /lib/arm-linux-gnueabihf/libhistory.so.8 to /lib/arm-linux-gnueabihf/libhistory.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/arm-linux-gnueabihf/libhistory.so.8.2 to /lib/arm-linux-gnueabihf/libhistory.so.8.2.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/arm-linux-gnueabihf/libreadline.so.8 to /lib/arm-linux-gnueabihf/libreadline.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/arm-linux-gnueabihf/libreadline.so.8.2 to /lib/arm-linux-gnueabihf/libreadline.so.8.2.usr-is-merged by libreadline8t64' Unpacking libreadline8t64:armhf (8.2-5) ... Selecting previously unselected package libpython3.12-stdlib:armhf. Preparing to unpack .../15-libpython3.12-stdlib_3.12.6-1_armhf.deb ... Unpacking libpython3.12-stdlib:armhf (3.12.6-1) ... Selecting previously unselected package python3.12. Preparing to unpack .../16-python3.12_3.12.6-1_armhf.deb ... Unpacking python3.12 (3.12.6-1) ... Selecting previously unselected package libpython3-stdlib:armhf. Preparing to unpack .../17-libpython3-stdlib_3.12.6-1_armhf.deb ... Unpacking libpython3-stdlib:armhf (3.12.6-1) ... Setting up python3-minimal (3.12.6-1) ... Selecting previously unselected package python3. (Reading database ... 21084 files and directories currently installed.)
Preparing to unpack .../000-python3_3.12.6-1_armhf.deb ... Unpacking python3 (3.12.6-1) ... Selecting previously unselected package sgml-base. Preparing to unpack .../001-sgml-base_1.31_all.deb ... Unpacking sgml-base (1.31) ... Selecting previously unselected package sensible-utils. Preparing to unpack .../002-sensible-utils_0.0.24_all.deb ... Unpacking sensible-utils (0.0.24) ... Selecting previously unselected package openssl. Preparing to unpack .../003-openssl_3.3.2-2_armhf.deb ... Unpacking openssl (3.3.2-2) ... Selecting previously unselected package ca-certificates. Preparing to unpack .../004-ca-certificates_20240203_all.deb ... Unpacking ca-certificates (20240203) ... Selecting previously unselected package libmagic-mgc. Preparing to unpack .../005-libmagic-mgc_1%3a5.45-3+b1_armhf.deb ... Unpacking libmagic-mgc (1:5.45-3+b1) ... Selecting previously unselected package libmagic1t64:armhf. Preparing to unpack .../006-libmagic1t64_1%3a5.45-3+b1_armhf.deb ... Unpacking libmagic1t64:armhf (1:5.45-3+b1) ... Selecting previously unselected package file. Preparing to unpack .../007-file_1%3a5.45-3+b1_armhf.deb ... Unpacking file (1:5.45-3+b1) ... Selecting previously unselected package gettext-base. Preparing to unpack .../008-gettext-base_0.22.5-2_armhf.deb ... Unpacking gettext-base (0.22.5-2) ... Selecting previously unselected package libuchardet0:armhf. Preparing to unpack .../009-libuchardet0_0.0.8-1+b2_armhf.deb ... Unpacking libuchardet0:armhf (0.0.8-1+b2) ... Selecting previously unselected package groff-base. Preparing to unpack .../010-groff-base_1.23.0-5_armhf.deb ... Unpacking groff-base (1.23.0-5) ... Selecting previously unselected package bsdextrautils. Preparing to unpack .../011-bsdextrautils_2.40.2-9_armhf.deb ... Unpacking bsdextrautils (2.40.2-9) ... Selecting previously unselected package libpipeline1:armhf. Preparing to unpack .../012-libpipeline1_1.5.8-1_armhf.deb ... Unpacking libpipeline1:armhf (1.5.8-1) ... 
Selecting previously unselected package man-db. Preparing to unpack .../013-man-db_2.13.0-1_armhf.deb ... Unpacking man-db (2.13.0-1) ... Selecting previously unselected package libedit2:armhf. Preparing to unpack .../014-libedit2_3.1-20240808-1_armhf.deb ... Unpacking libedit2:armhf (3.1-20240808-1) ... Selecting previously unselected package libcbor0.10:armhf. Preparing to unpack .../015-libcbor0.10_0.10.2-2_armhf.deb ... Unpacking libcbor0.10:armhf (0.10.2-2) ... Selecting previously unselected package libfido2-1:armhf. Preparing to unpack .../016-libfido2-1_1.15.0-1+b1_armhf.deb ... Unpacking libfido2-1:armhf (1.15.0-1+b1) ... Selecting previously unselected package openssh-client. Preparing to unpack .../017-openssh-client_1%3a9.9p1-3_armhf.deb ... Unpacking openssh-client (1:9.9p1-3) ... Selecting previously unselected package ucf. Preparing to unpack .../018-ucf_3.0043+nmu1_all.deb ... Moving old data out of the way Unpacking ucf (3.0043+nmu1) ... Selecting previously unselected package m4. Preparing to unpack .../019-m4_1.4.19-4_armhf.deb ... Unpacking m4 (1.4.19-4) ... Selecting previously unselected package autoconf. Preparing to unpack .../020-autoconf_2.72-3_all.deb ... Unpacking autoconf (2.72-3) ... Selecting previously unselected package autotools-dev. Preparing to unpack .../021-autotools-dev_20220109.1_all.deb ... Unpacking autotools-dev (20220109.1) ... Selecting previously unselected package automake. Preparing to unpack .../022-automake_1%3a1.16.5-1.3_all.deb ... Unpacking automake (1:1.16.5-1.3) ... Selecting previously unselected package autopoint. Preparing to unpack .../023-autopoint_0.22.5-2_all.deb ... Unpacking autopoint (0.22.5-2) ... Selecting previously unselected package cython3. Preparing to unpack .../024-cython3_3.0.11+dfsg-1_armhf.deb ... Unpacking cython3 (3.0.11+dfsg-1) ... Selecting previously unselected package libdebhelper-perl. Preparing to unpack .../025-libdebhelper-perl_13.20_all.deb ... 
Unpacking libdebhelper-perl (13.20) ... Selecting previously unselected package libtool. Preparing to unpack .../026-libtool_2.4.7-7_all.deb ... Unpacking libtool (2.4.7-7) ... Selecting previously unselected package dh-autoreconf. Preparing to unpack .../027-dh-autoreconf_20_all.deb ... Unpacking dh-autoreconf (20) ... Selecting previously unselected package libarchive-zip-perl. Preparing to unpack .../028-libarchive-zip-perl_1.68-1_all.deb ... Unpacking libarchive-zip-perl (1.68-1) ... Selecting previously unselected package libfile-stripnondeterminism-perl. Preparing to unpack .../029-libfile-stripnondeterminism-perl_1.14.0-1_all.deb ... Unpacking libfile-stripnondeterminism-perl (1.14.0-1) ... Selecting previously unselected package dh-strip-nondeterminism. Preparing to unpack .../030-dh-strip-nondeterminism_1.14.0-1_all.deb ... Unpacking dh-strip-nondeterminism (1.14.0-1) ... Selecting previously unselected package libelf1t64:armhf. Preparing to unpack .../031-libelf1t64_0.192-4_armhf.deb ... Unpacking libelf1t64:armhf (0.192-4) ... Selecting previously unselected package dwz. Preparing to unpack .../032-dwz_0.15-1+b2_armhf.deb ... Unpacking dwz (0.15-1+b2) ... Selecting previously unselected package libicu72:armhf. Preparing to unpack .../033-libicu72_72.1-5+b1_armhf.deb ... Unpacking libicu72:armhf (72.1-5+b1) ... Selecting previously unselected package libxml2:armhf. Preparing to unpack .../034-libxml2_2.12.7+dfsg+really2.9.14-0.1_armhf.deb ... Unpacking libxml2:armhf (2.12.7+dfsg+really2.9.14-0.1) ... Selecting previously unselected package gettext. Preparing to unpack .../035-gettext_0.22.5-2_armhf.deb ... Unpacking gettext (0.22.5-2) ... Selecting previously unselected package intltool-debian. Preparing to unpack .../036-intltool-debian_0.35.0+20060710.6_all.deb ... Unpacking intltool-debian (0.35.0+20060710.6) ... Selecting previously unselected package po-debconf. Preparing to unpack .../037-po-debconf_1.0.21+nmu1_all.deb ... 
Unpacking po-debconf (1.0.21+nmu1) ... Selecting previously unselected package debhelper. Preparing to unpack .../038-debhelper_13.20_all.deb ... Unpacking debhelper (13.20) ... Selecting previously unselected package python3-autocommand. Preparing to unpack .../039-python3-autocommand_2.2.2-3_all.deb ... Unpacking python3-autocommand (2.2.2-3) ... Selecting previously unselected package python3-more-itertools. Preparing to unpack .../040-python3-more-itertools_10.5.0-1_all.deb ... Unpacking python3-more-itertools (10.5.0-1) ... Selecting previously unselected package python3-typing-extensions. Preparing to unpack .../041-python3-typing-extensions_4.12.2-2_all.deb ... Unpacking python3-typing-extensions (4.12.2-2) ... Selecting previously unselected package python3-typeguard. Preparing to unpack .../042-python3-typeguard_4.4.1-1_all.deb ... Unpacking python3-typeguard (4.4.1-1) ... Selecting previously unselected package python3-inflect. Preparing to unpack .../043-python3-inflect_7.3.1-2_all.deb ... Unpacking python3-inflect (7.3.1-2) ... Selecting previously unselected package python3-jaraco.context. Preparing to unpack .../044-python3-jaraco.context_6.0.0-1_all.deb ... Unpacking python3-jaraco.context (6.0.0-1) ... Selecting previously unselected package python3-jaraco.functools. Preparing to unpack .../045-python3-jaraco.functools_4.1.0-1_all.deb ... Unpacking python3-jaraco.functools (4.1.0-1) ... Selecting previously unselected package python3-pkg-resources. Preparing to unpack .../046-python3-pkg-resources_74.1.2-2_all.deb ... Unpacking python3-pkg-resources (74.1.2-2) ... Selecting previously unselected package python3-zipp. Preparing to unpack .../047-python3-zipp_3.20.2-1_all.deb ... Unpacking python3-zipp (3.20.2-1) ... Selecting previously unselected package python3-setuptools. Preparing to unpack .../048-python3-setuptools_74.1.2-2_all.deb ... Unpacking python3-setuptools (74.1.2-2) ... Selecting previously unselected package dh-python. 
Preparing to unpack .../049-dh-python_6.20241024_all.deb ... Unpacking dh-python (6.20241024) ... Selecting previously unselected package xml-core. Preparing to unpack .../050-xml-core_0.19_all.deb ... Unpacking xml-core (0.19) ... Selecting previously unselected package docutils-common. Preparing to unpack .../051-docutils-common_0.21.2+dfsg-2_all.deb ... Unpacking docutils-common (0.21.2+dfsg-2) ... Selecting previously unselected package fonts-dejavu-mono. Preparing to unpack .../052-fonts-dejavu-mono_2.37-8_all.deb ... Unpacking fonts-dejavu-mono (2.37-8) ... Selecting previously unselected package fonts-dejavu-core. Preparing to unpack .../053-fonts-dejavu-core_2.37-8_all.deb ... Unpacking fonts-dejavu-core (2.37-8) ... Selecting previously unselected package fontconfig-config. Preparing to unpack .../054-fontconfig-config_2.15.0-1.1+b1_armhf.deb ... Unpacking fontconfig-config (2.15.0-1.1+b1) ... Selecting previously unselected package fonts-lmodern. Preparing to unpack .../055-fonts-lmodern_2.005-1_all.deb ... Unpacking fonts-lmodern (2.005-1) ... Selecting previously unselected package libgfortran5:armhf. Preparing to unpack .../056-libgfortran5_14.2.0-6_armhf.deb ... Unpacking libgfortran5:armhf (14.2.0-6) ... Selecting previously unselected package libgfortran-14-dev:armhf. Preparing to unpack .../057-libgfortran-14-dev_14.2.0-6_armhf.deb ... Unpacking libgfortran-14-dev:armhf (14.2.0-6) ... Selecting previously unselected package gfortran-14-arm-linux-gnueabihf. Preparing to unpack .../058-gfortran-14-arm-linux-gnueabihf_14.2.0-6_armhf.deb ... Unpacking gfortran-14-arm-linux-gnueabihf (14.2.0-6) ... Selecting previously unselected package gfortran-14. Preparing to unpack .../059-gfortran-14_14.2.0-6_armhf.deb ... Unpacking gfortran-14 (14.2.0-6) ... Selecting previously unselected package libhwloc15:armhf. Preparing to unpack .../060-libhwloc15_2.11.2-1_armhf.deb ... Unpacking libhwloc15:armhf (2.11.2-1) ... 
Selecting previously unselected package hwloc-nox. Preparing to unpack .../061-hwloc-nox_2.11.2-1_armhf.deb ... Unpacking hwloc-nox (2.11.2-1) ... Selecting previously unselected package tex-common. Preparing to unpack .../062-tex-common_6.18_all.deb ... Unpacking tex-common (6.18) ... Selecting previously unselected package libpaper1:armhf. Preparing to unpack .../063-libpaper1_1.1.29+b2_armhf.deb ... Unpacking libpaper1:armhf (1.1.29+b2) ... Selecting previously unselected package libpaper-utils. Preparing to unpack .../064-libpaper-utils_1.1.29+b2_armhf.deb ... Unpacking libpaper-utils (1.1.29+b2) ... Selecting previously unselected package libkpathsea6:armhf. Preparing to unpack .../065-libkpathsea6_2024.20240313.70630+ds-4_armhf.deb ... Unpacking libkpathsea6:armhf (2024.20240313.70630+ds-4) ... Selecting previously unselected package libptexenc1:armhf. Preparing to unpack .../066-libptexenc1_2024.20240313.70630+ds-4_armhf.deb ... Unpacking libptexenc1:armhf (2024.20240313.70630+ds-4) ... Selecting previously unselected package libsynctex2:armhf. Preparing to unpack .../067-libsynctex2_2024.20240313.70630+ds-4_armhf.deb ... Unpacking libsynctex2:armhf (2024.20240313.70630+ds-4) ... Selecting previously unselected package libtexlua53-5:armhf. Preparing to unpack .../068-libtexlua53-5_2024.20240313.70630+ds-4_armhf.deb ... Unpacking libtexlua53-5:armhf (2024.20240313.70630+ds-4) ... Selecting previously unselected package t1utils. Preparing to unpack .../069-t1utils_1.41-4_armhf.deb ... Unpacking t1utils (1.41-4) ... Selecting previously unselected package libbrotli1:armhf. Preparing to unpack .../070-libbrotli1_1.1.0-2+b5_armhf.deb ... Unpacking libbrotli1:armhf (1.1.0-2+b5) ... Selecting previously unselected package libpng16-16t64:armhf. Preparing to unpack .../071-libpng16-16t64_1.6.44-2_armhf.deb ... Unpacking libpng16-16t64:armhf (1.6.44-2) ... Selecting previously unselected package libfreetype6:armhf. 
Preparing to unpack .../072-libfreetype6_2.13.3+dfsg-1_armhf.deb ... Unpacking libfreetype6:armhf (2.13.3+dfsg-1) ... Selecting previously unselected package libfontconfig1:armhf. Preparing to unpack .../073-libfontconfig1_2.15.0-1.1+b1_armhf.deb ... Unpacking libfontconfig1:armhf (2.15.0-1.1+b1) ... Selecting previously unselected package libpixman-1-0:armhf. Preparing to unpack .../074-libpixman-1-0_0.42.2-1+b1_armhf.deb ... Unpacking libpixman-1-0:armhf (0.42.2-1+b1) ... Selecting previously unselected package libxau6:armhf. Preparing to unpack .../075-libxau6_1%3a1.0.11-1_armhf.deb ... Unpacking libxau6:armhf (1:1.0.11-1) ... Selecting previously unselected package libxdmcp6:armhf. Preparing to unpack .../076-libxdmcp6_1%3a1.1.2-3+b2_armhf.deb ... Unpacking libxdmcp6:armhf (1:1.1.2-3+b2) ... Selecting previously unselected package libxcb1:armhf. Preparing to unpack .../077-libxcb1_1.17.0-2+b1_armhf.deb ... Unpacking libxcb1:armhf (1.17.0-2+b1) ... Selecting previously unselected package libx11-data. Preparing to unpack .../078-libx11-data_2%3a1.8.7-1_all.deb ... Unpacking libx11-data (2:1.8.7-1) ... Selecting previously unselected package libx11-6:armhf. Preparing to unpack .../079-libx11-6_2%3a1.8.7-1+b2_armhf.deb ... Unpacking libx11-6:armhf (2:1.8.7-1+b2) ... Selecting previously unselected package libxcb-render0:armhf. Preparing to unpack .../080-libxcb-render0_1.17.0-2+b1_armhf.deb ... Unpacking libxcb-render0:armhf (1.17.0-2+b1) ... Selecting previously unselected package libxcb-shm0:armhf. Preparing to unpack .../081-libxcb-shm0_1.17.0-2+b1_armhf.deb ... Unpacking libxcb-shm0:armhf (1.17.0-2+b1) ... Selecting previously unselected package libxext6:armhf. Preparing to unpack .../082-libxext6_2%3a1.3.4-1+b2_armhf.deb ... Unpacking libxext6:armhf (2:1.3.4-1+b2) ... Selecting previously unselected package libxrender1:armhf. Preparing to unpack .../083-libxrender1_1%3a0.9.10-1.1+b2_armhf.deb ... Unpacking libxrender1:armhf (1:0.9.10-1.1+b2) ... 
Selecting previously unselected package libcairo2:armhf. Preparing to unpack .../084-libcairo2_1.18.2-2_armhf.deb ... Unpacking libcairo2:armhf (1.18.2-2) ... Selecting previously unselected package libgraphite2-3:armhf. Preparing to unpack .../085-libgraphite2-3_1.3.14-2+b1_armhf.deb ... Unpacking libgraphite2-3:armhf (1.3.14-2+b1) ... Selecting previously unselected package libglib2.0-0t64:armhf. Preparing to unpack .../086-libglib2.0-0t64_2.82.2-2_armhf.deb ... Unpacking libglib2.0-0t64:armhf (2.82.2-2) ... Selecting previously unselected package libharfbuzz0b:armhf. Preparing to unpack .../087-libharfbuzz0b_10.0.1-1_armhf.deb ... Unpacking libharfbuzz0b:armhf (10.0.1-1) ... Selecting previously unselected package libmpfi0:armhf. Preparing to unpack .../088-libmpfi0_1.5.4+ds-3_armhf.deb ... Unpacking libmpfi0:armhf (1.5.4+ds-3) ... Selecting previously unselected package libpotrace0:armhf. Preparing to unpack .../089-libpotrace0_1.16-2+b2_armhf.deb ... Unpacking libpotrace0:armhf (1.16-2+b2) ... Selecting previously unselected package libteckit0:armhf. Preparing to unpack .../090-libteckit0_2.5.12+ds1-1+b1_armhf.deb ... Unpacking libteckit0:armhf (2.5.12+ds1-1+b1) ... Selecting previously unselected package x11-common. Preparing to unpack .../091-x11-common_1%3a7.7+23.1_all.deb ... Unpacking x11-common (1:7.7+23.1) ... Selecting previously unselected package libice6:armhf. Preparing to unpack .../092-libice6_2%3a1.1.1-1_armhf.deb ... Unpacking libice6:armhf (2:1.1.1-1) ... Selecting previously unselected package libsm6:armhf. Preparing to unpack .../093-libsm6_2%3a1.2.4-1_armhf.deb ... Unpacking libsm6:armhf (2:1.2.4-1) ... Selecting previously unselected package libxt6t64:armhf. Preparing to unpack .../094-libxt6t64_1%3a1.2.1-1.2+b1_armhf.deb ... Unpacking libxt6t64:armhf (1:1.2.1-1.2+b1) ... Selecting previously unselected package libxmu6:armhf. Preparing to unpack .../095-libxmu6_2%3a1.1.3-3+b3_armhf.deb ... Unpacking libxmu6:armhf (2:1.1.3-3+b3) ... 
Selecting previously unselected package libxpm4:armhf. Preparing to unpack .../096-libxpm4_1%3a3.5.17-1+b2_armhf.deb ... Unpacking libxpm4:armhf (1:3.5.17-1+b2) ... Selecting previously unselected package libxaw7:armhf. Preparing to unpack .../097-libxaw7_2%3a1.0.14-1+b3_armhf.deb ... Unpacking libxaw7:armhf (2:1.0.14-1+b3) ... Selecting previously unselected package libxi6:armhf. Preparing to unpack .../098-libxi6_2%3a1.8.2-1_armhf.deb ... Unpacking libxi6:armhf (2:1.8.2-1) ... Selecting previously unselected package libzzip-0-13t64:armhf. Preparing to unpack .../099-libzzip-0-13t64_0.13.72+dfsg.1-1.2+b1_armhf.deb ... Unpacking libzzip-0-13t64:armhf (0.13.72+dfsg.1-1.2+b1) ... Selecting previously unselected package texlive-binaries. Preparing to unpack .../100-texlive-binaries_2024.20240313.70630+ds-4_armhf.deb ... Unpacking texlive-binaries (2024.20240313.70630+ds-4) ... Selecting previously unselected package xdg-utils. Preparing to unpack .../101-xdg-utils_1.1.3-4.1_all.deb ... Unpacking xdg-utils (1.1.3-4.1) ... Selecting previously unselected package texlive-base. Preparing to unpack .../102-texlive-base_2024.20240829-2_all.deb ... Unpacking texlive-base (2024.20240829-2) ... Selecting previously unselected package texlive-latex-base. Preparing to unpack .../103-texlive-latex-base_2024.20240829-2_all.deb ... Unpacking texlive-latex-base (2024.20240829-2) ... Selecting previously unselected package latexmk. Preparing to unpack .../104-latexmk_1%3a4.85-1_all.deb ... Unpacking latexmk (1:4.85-1) ... Selecting previously unselected package libapache-pom-java. Preparing to unpack .../105-libapache-pom-java_33-2_all.deb ... Unpacking libapache-pom-java (33-2) ... Selecting previously unselected package libblas3:armhf. Preparing to unpack .../106-libblas3_3.12.0-3+b1_armhf.deb ... Unpacking libblas3:armhf (3.12.0-3+b1) ... Selecting previously unselected package libcommons-parent-java. Preparing to unpack .../107-libcommons-parent-java_56-1_all.deb ... 
Unpacking libcommons-parent-java (56-1) ... Selecting previously unselected package libcommons-logging-java. Preparing to unpack .../108-libcommons-logging-java_1.3.0-1_all.deb ... Unpacking libcommons-logging-java (1.3.0-1) ... Selecting previously unselected package libexpat1-dev:armhf. Preparing to unpack .../109-libexpat1-dev_2.6.3-2_armhf.deb ... Unpacking libexpat1-dev:armhf (2.6.3-2) ... Selecting previously unselected package libfontbox-java. Preparing to unpack .../110-libfontbox-java_1%3a1.8.16-5_all.deb ... Unpacking libfontbox-java (1:1.8.16-5) ... Selecting previously unselected package libfontenc1:armhf. Preparing to unpack .../111-libfontenc1_1%3a1.1.8-1+b1_armhf.deb ... Unpacking libfontenc1:armhf (1:1.1.8-1+b1) ... Selecting previously unselected package libnuma1:armhf. Preparing to unpack .../112-libnuma1_2.0.18-1+b1_armhf.deb ... Unpacking libnuma1:armhf (2.0.18-1+b1) ... Selecting previously unselected package libnuma-dev:armhf. Preparing to unpack .../113-libnuma-dev_2.0.18-1+b1_armhf.deb ... Unpacking libnuma-dev:armhf (2.0.18-1+b1) ... Selecting previously unselected package libltdl7:armhf. Preparing to unpack .../114-libltdl7_2.4.7-7+b2_armhf.deb ... Unpacking libltdl7:armhf (2.4.7-7+b2) ... Selecting previously unselected package libltdl-dev:armhf. Preparing to unpack .../115-libltdl-dev_2.4.7-7+b2_armhf.deb ... Unpacking libltdl-dev:armhf (2.4.7-7+b2) ... Selecting previously unselected package libhwloc-dev:armhf. Preparing to unpack .../116-libhwloc-dev_2.11.2-1_armhf.deb ... Unpacking libhwloc-dev:armhf (2.11.2-1) ... Selecting previously unselected package libjs-jquery. Preparing to unpack .../117-libjs-jquery_3.6.1+dfsg+~3.5.14-1_all.deb ... Unpacking libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... Selecting previously unselected package libjs-underscore. Preparing to unpack .../118-libjs-underscore_1.13.4~dfsg+~1.11.4-3_all.deb ... Unpacking libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... Selecting previously unselected package libjs-sphinxdoc. 
Preparing to unpack .../119-libjs-sphinxdoc_7.4.7-4_all.deb ... Unpacking libjs-sphinxdoc (7.4.7-4) ... Selecting previously unselected package libjson-perl. Preparing to unpack .../120-libjson-perl_4.10000-1_all.deb ... Unpacking libjson-perl (4.10000-1) ... Selecting previously unselected package liblapack3:armhf. Preparing to unpack .../121-liblapack3_3.12.0-3+b1_armhf.deb ... Unpacking liblapack3:armhf (3.12.0-3+b1) ... Selecting previously unselected package libmpich12:armhf. Preparing to unpack .../122-libmpich12_4.2.0-14_armhf.deb ... Unpacking libmpich12:armhf (4.2.0-14) ... Selecting previously unselected package libslurm41t64. Preparing to unpack .../123-libslurm41t64_24.05.2-1_armhf.deb ... Unpacking libslurm41t64 (24.05.2-1) ... Selecting previously unselected package mpich. Preparing to unpack .../124-mpich_4.2.0-14_armhf.deb ... Unpacking mpich (4.2.0-14) ... Selecting previously unselected package libmpich-dev:armhf. Preparing to unpack .../125-libmpich-dev_4.2.0-14_armhf.deb ... Unpacking libmpich-dev:armhf (4.2.0-14) ... Selecting previously unselected package libpdfbox-java. Preparing to unpack .../126-libpdfbox-java_1%3a1.8.16-5_all.deb ... Unpacking libpdfbox-java (1:1.8.16-5) ... Selecting previously unselected package libpython3.12t64:armhf. Preparing to unpack .../127-libpython3.12t64_3.12.6-1_armhf.deb ... Unpacking libpython3.12t64:armhf (3.12.6-1) ... Selecting previously unselected package zlib1g-dev:armhf. Preparing to unpack .../128-zlib1g-dev_1%3a1.3.dfsg+really1.3.1-1+b1_armhf.deb ... Unpacking zlib1g-dev:armhf (1:1.3.dfsg+really1.3.1-1+b1) ... Selecting previously unselected package libpython3.12-dev:armhf. Preparing to unpack .../129-libpython3.12-dev_3.12.6-1_armhf.deb ... Unpacking libpython3.12-dev:armhf (3.12.6-1) ... Selecting previously unselected package libpython3-dev:armhf. Preparing to unpack .../130-libpython3-dev_3.12.6-1_armhf.deb ... Unpacking libpython3-dev:armhf (3.12.6-1) ... 
Selecting previously unselected package libpython3-all-dev:armhf. Preparing to unpack .../131-libpython3-all-dev_3.12.6-1_armhf.deb ... Unpacking libpython3-all-dev:armhf (3.12.6-1) ... Selecting previously unselected package libtext-unidecode-perl. Preparing to unpack .../132-libtext-unidecode-perl_1.30-3_all.deb ... Unpacking libtext-unidecode-perl (1.30-3) ... Selecting previously unselected package libxml-namespacesupport-perl. Preparing to unpack .../133-libxml-namespacesupport-perl_1.12-2_all.deb ... Unpacking libxml-namespacesupport-perl (1.12-2) ... Selecting previously unselected package libxml-sax-base-perl. Preparing to unpack .../134-libxml-sax-base-perl_1.09-3_all.deb ... Unpacking libxml-sax-base-perl (1.09-3) ... Selecting previously unselected package libxml-sax-perl. Preparing to unpack .../135-libxml-sax-perl_1.02+dfsg-3_all.deb ... Unpacking libxml-sax-perl (1.02+dfsg-3) ... Selecting previously unselected package libxml-libxml-perl. Preparing to unpack .../136-libxml-libxml-perl_2.0207+dfsg+really+2.0134-5+b1_armhf.deb ... Unpacking libxml-libxml-perl (2.0207+dfsg+really+2.0134-5+b1) ... Selecting previously unselected package mpi-default-bin. Preparing to unpack .../137-mpi-default-bin_1.17_armhf.deb ... Unpacking mpi-default-bin (1.17) ... Selecting previously unselected package mpi-default-dev. Preparing to unpack .../138-mpi-default-dev_1.17_armhf.deb ... Unpacking mpi-default-dev (1.17) ... Selecting previously unselected package preview-latex-style. Preparing to unpack .../139-preview-latex-style_13.2-1_all.deb ... Unpacking preview-latex-style (13.2-1) ... Selecting previously unselected package python-babel-localedata. Preparing to unpack .../140-python-babel-localedata_2.14.0-1_all.deb ... Unpacking python-babel-localedata (2.14.0-1) ... Selecting previously unselected package python-numpy-doc. Preparing to unpack .../141-python-numpy-doc_1%3a1.26.4+ds-11_all.deb ... Unpacking python-numpy-doc (1:1.26.4+ds-11) ... 
Selecting previously unselected package python3-alabaster. Preparing to unpack .../142-python3-alabaster_0.7.16-0.1_all.deb ... Unpacking python3-alabaster (0.7.16-0.1) ... Selecting previously unselected package python3-all. Preparing to unpack .../143-python3-all_3.12.6-1_armhf.deb ... Unpacking python3-all (3.12.6-1) ... Selecting previously unselected package python3.12-dev. Preparing to unpack .../144-python3.12-dev_3.12.6-1_armhf.deb ... Unpacking python3.12-dev (3.12.6-1) ... Selecting previously unselected package python3-dev. Preparing to unpack .../145-python3-dev_3.12.6-1_armhf.deb ... Unpacking python3-dev (3.12.6-1) ... Selecting previously unselected package python3-all-dev. Preparing to unpack .../146-python3-all-dev_3.12.6-1_armhf.deb ... Unpacking python3-all-dev (3.12.6-1) ... Selecting previously unselected package python3-babel. Preparing to unpack .../147-python3-babel_2.14.0-1_all.deb ... Unpacking python3-babel (2.14.0-1) ... Selecting previously unselected package python3-certifi. Preparing to unpack .../148-python3-certifi_2024.8.30+dfsg-1_all.deb ... Unpacking python3-certifi (2024.8.30+dfsg-1) ... Selecting previously unselected package python3-chardet. Preparing to unpack .../149-python3-chardet_5.2.0+dfsg-1_all.deb ... Unpacking python3-chardet (5.2.0+dfsg-1) ... Selecting previously unselected package python3-charset-normalizer. Preparing to unpack .../150-python3-charset-normalizer_3.4.0-1_armhf.deb ... Unpacking python3-charset-normalizer (3.4.0-1) ... Selecting previously unselected package python3-defusedxml. Preparing to unpack .../151-python3-defusedxml_0.7.1-2_all.deb ... Unpacking python3-defusedxml (0.7.1-2) ... Selecting previously unselected package python3.12-doc. Preparing to unpack .../152-python3.12-doc_3.12.6-1_all.deb ... Unpacking python3.12-doc (3.12.6-1) ... Selecting previously unselected package python3-doc. Preparing to unpack .../153-python3-doc_3.12.6-1_all.deb ... Unpacking python3-doc (3.12.6-1) ... 
Selecting previously unselected package python3-roman. Preparing to unpack .../154-python3-roman_4.2-1_all.deb ... Unpacking python3-roman (4.2-1) ... Selecting previously unselected package python3-docutils. Preparing to unpack .../155-python3-docutils_0.21.2+dfsg-2_all.deb ... Unpacking python3-docutils (0.21.2+dfsg-2) ... Selecting previously unselected package python3-idna. Preparing to unpack .../156-python3-idna_3.8-2_all.deb ... Unpacking python3-idna (3.8-2) ... Selecting previously unselected package python3-imagesize. Preparing to unpack .../157-python3-imagesize_1.4.1-1_all.deb ... Unpacking python3-imagesize (1.4.1-1) ... Selecting previously unselected package python3-markupsafe. Preparing to unpack .../158-python3-markupsafe_2.1.5-1+b1_armhf.deb ... Unpacking python3-markupsafe (2.1.5-1+b1) ... Selecting previously unselected package python3-jinja2. Preparing to unpack .../159-python3-jinja2_3.1.3-1_all.deb ... Unpacking python3-jinja2 (3.1.3-1) ... Selecting previously unselected package python3-numpy. Preparing to unpack .../160-python3-numpy_1%3a1.26.4+ds-11_armhf.deb ... Unpacking python3-numpy (1:1.26.4+ds-11) ... Selecting previously unselected package python3-packaging. Preparing to unpack .../161-python3-packaging_24.1-1_all.deb ... Unpacking python3-packaging (24.1-1) ... Selecting previously unselected package python3-pygments. Preparing to unpack .../162-python3-pygments_2.18.0+dfsg-1_all.deb ... Unpacking python3-pygments (2.18.0+dfsg-1) ... Selecting previously unselected package python3-urllib3. Preparing to unpack .../163-python3-urllib3_2.0.7-2_all.deb ... Unpacking python3-urllib3 (2.0.7-2) ... Selecting previously unselected package python3-requests. Preparing to unpack .../164-python3-requests_2.32.3+dfsg-1_all.deb ... Unpacking python3-requests (2.32.3+dfsg-1) ... Selecting previously unselected package python3-snowballstemmer. Preparing to unpack .../165-python3-snowballstemmer_2.2.0-4_all.deb ... 
Unpacking python3-snowballstemmer (2.2.0-4) ... Selecting previously unselected package sphinx-common. Preparing to unpack .../166-sphinx-common_7.4.7-4_all.deb ... Unpacking sphinx-common (7.4.7-4) ... Selecting previously unselected package python3-sphinx. Preparing to unpack .../167-python3-sphinx_7.4.7-4_all.deb ... Unpacking python3-sphinx (7.4.7-4) ... Selecting previously unselected package xfonts-encodings. Preparing to unpack .../168-xfonts-encodings_1%3a1.0.4-2.2_all.deb ... Unpacking xfonts-encodings (1:1.0.4-2.2) ... Selecting previously unselected package xfonts-utils. Preparing to unpack .../169-xfonts-utils_1%3a7.7+7_armhf.deb ... Unpacking xfonts-utils (1:7.7+7) ... Selecting previously unselected package tex-gyre. Preparing to unpack .../170-tex-gyre_20180621-6_all.deb ... Unpacking tex-gyre (20180621-6) ... Selecting previously unselected package texinfo-lib. Preparing to unpack .../171-texinfo-lib_7.1.1-1+b1_armhf.deb ... Unpacking texinfo-lib (7.1.1-1+b1) ... Selecting previously unselected package texinfo. Preparing to unpack .../172-texinfo_7.1.1-1_all.deb ... Unpacking texinfo (7.1.1-1) ... Selecting previously unselected package texlive-fonts-recommended. Preparing to unpack .../173-texlive-fonts-recommended_2024.20240829-2_all.deb ... Unpacking texlive-fonts-recommended (2024.20240829-2) ... Selecting previously unselected package texlive-latex-recommended. Preparing to unpack .../174-texlive-latex-recommended_2024.20240829-2_all.deb ... Unpacking texlive-latex-recommended (2024.20240829-2) ... Selecting previously unselected package texlive. Preparing to unpack .../175-texlive_2024.20240829-2_all.deb ... Unpacking texlive (2024.20240829-2) ... Selecting previously unselected package texlive-pictures. Preparing to unpack .../176-texlive-pictures_2024.20240829-2_all.deb ... Unpacking texlive-pictures (2024.20240829-2) ... Selecting previously unselected package texlive-latex-extra. 
Preparing to unpack .../177-texlive-latex-extra_2024.20240829-1_all.deb ... Unpacking texlive-latex-extra (2024.20240829-1) ... Setting up media-types (10.1.0) ... Setting up libpipeline1:armhf (1.5.8-1) ... Setting up libgraphite2-3:armhf (1.3.14-2+b1) ... Setting up libpixman-1-0:armhf (0.42.2-1+b1) ... Setting up libxau6:armhf (1:1.0.11-1) ... Setting up libxdmcp6:armhf (1:1.1.2-3+b2) ... Setting up libkeyutils1:armhf (1.6.3-4) ... Setting up libxcb1:armhf (1.17.0-2+b1) ... Setting up libicu72:armhf (72.1-5+b1) ... Setting up bsdextrautils (2.40.2-9) ... Setting up libmagic-mgc (1:5.45-3+b1) ... Setting up libxcb-render0:armhf (1.17.0-2+b1) ... Setting up libcbor0.10:armhf (0.10.2-2) ... Setting up libarchive-zip-perl (1.68-1) ... Setting up libtirpc-common (1.3.4+ds-1.3) ... Setting up libdebhelper-perl (13.20) ... Setting up libbrotli1:armhf (1.1.0-2+b5) ... Setting up libfontbox-java (1:1.8.16-5) ... Setting up libedit2:armhf (3.1-20240808-1) ... Setting up libmagic1t64:armhf (1:5.45-3+b1) ... Setting up x11-common (1:7.7+23.1) ... invoke-rc.d: could not determine current runlevel Setting up X socket directories... /tmp/.X11-unix /tmp/.ICE-unix. Setting up libxml-namespacesupport-perl (1.12-2) ... Setting up gettext-base (0.22.5-2) ... Setting up m4 (1.4.19-4) ... Setting up libxcb-shm0:armhf (1.17.0-2+b1) ... Setting up libcom-err2:armhf (1.47.1-1+b1) ... Setting up file (1:5.45-3+b1) ... Setting up texinfo-lib (7.1.1-1+b1) ... Setting up libelf1t64:armhf (0.192-4) ... Setting up python-babel-localedata (2.14.0-1) ... Setting up libkrb5support0:armhf (1.21.3-3) ... Setting up tzdata (2024a-4) ... Current default time zone: 'Etc/UTC' Local time is now: Thu Nov 7 10:10:57 UTC 2024. Universal Time is now: Thu Nov 7 10:10:57 UTC 2024. Run 'dpkg-reconfigure tzdata' if you wish to change it. Setting up libxml-sax-base-perl (1.09-3) ... Setting up libfontenc1:armhf (1:1.1.8-1+b1) ... Setting up autotools-dev (20220109.1) ... 
Setting up libglib2.0-0t64:armhf (2.82.2-2) ... No schema files found: doing nothing. Setting up libblas3:armhf (3.12.0-3+b1) ... update-alternatives: using /usr/lib/arm-linux-gnueabihf/blas/libblas.so.3 to provide /usr/lib/arm-linux-gnueabihf/libblas.so.3 (libblas.so.3-arm-linux-gnueabihf) in auto mode Setting up libexpat1-dev:armhf (2.6.3-2) ... Setting up libzzip-0-13t64:armhf (0.13.72+dfsg.1-1.2+b1) ... Setting up libx11-data (2:1.8.7-1) ... Setting up libteckit0:armhf (2.5.12+ds1-1+b1) ... Setting up libapache-pom-java (33-2) ... Setting up xfonts-encodings (1:1.0.4-2.2) ... Setting up t1utils (1.41-4) ... Setting up libtexlua53-5:armhf (2024.20240313.70630+ds-4) ... Setting up fonts-dejavu-mono (2.37-8) ... Setting up libpng16-16t64:armhf (1.6.44-2) ... Setting up libhwloc15:armhf (2.11.2-1) ... Setting up autopoint (0.22.5-2) ... Setting up libmpfi0:armhf (1.5.4+ds-3) ... Setting up fonts-dejavu-core (2.37-8) ... Setting up libk5crypto3:armhf (1.21.3-3) ... Setting up libltdl7:armhf (2.4.7-7+b2) ... Setting up libkpathsea6:armhf (2024.20240313.70630+ds-4) ... Setting up libgfortran5:armhf (14.2.0-6) ... Setting up autoconf (2.72-3) ... Setting up zlib1g-dev:armhf (1:1.3.dfsg+really1.3.1-1+b1) ... Setting up libnuma1:armhf (2.0.18-1+b1) ... Setting up dwz (0.15-1+b2) ... Setting up libmpich12:armhf (4.2.0-14) ... Setting up sensible-utils (0.0.24) ... Setting up libuchardet0:armhf (0.0.8-1+b2) ... Setting up libjson-perl (4.10000-1) ... Setting up fonts-lmodern (2.005-1) ... Setting up libx11-6:armhf (2:1.8.7-1+b2) ... Setting up libslurm41t64 (24.05.2-1) ... Setting up netbase (6.4) ... Setting up sgml-base (1.31) ... Setting up libkrb5-3:armhf (1.21.3-3) ... Setting up libjs-jquery (3.6.1+dfsg+~3.5.14-1) ... Setting up libfido2-1:armhf (1.15.0-1+b1) ... Setting up libtext-unidecode-perl (1.30-3) ... Setting up openssl (3.3.2-2) ... Setting up readline-common (8.2-5) ... Setting up libxml2:armhf (2.12.7+dfsg+really2.9.14-0.1) ... 
Setting up xdg-utils (1.1.3-4.1) ... update-alternatives: using /usr/bin/xdg-open to provide /usr/bin/open (open) in auto mode Setting up libsynctex2:armhf (2024.20240313.70630+ds-4) ... Setting up libjs-underscore (1.13.4~dfsg+~1.11.4-3) ... Setting up libpotrace0:armhf (1.16-2+b2) ... Setting up automake (1:1.16.5-1.3) ... update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode Setting up libgfortran-14-dev:armhf (14.2.0-6) ... Setting up libfile-stripnondeterminism-perl (1.14.0-1) ... Setting up libice6:armhf (2:1.1.1-1) ... Setting up liblapack3:armhf (3.12.0-3+b1) ... update-alternatives: using /usr/lib/arm-linux-gnueabihf/lapack/liblapack.so.3 to provide /usr/lib/arm-linux-gnueabihf/liblapack.so.3 (liblapack.so.3-arm-linux-gnueabihf) in auto mode Setting up gettext (0.22.5-2) ... Setting up libpdfbox-java (1:1.8.16-5) ... Setting up libxpm4:armhf (1:3.5.17-1+b2) ... Setting up libxrender1:armhf (1:0.9.10-1.1+b2) ... Setting up libtool (2.4.7-7) ... Setting up fontconfig-config (2.15.0-1.1+b1) ... Setting up libcommons-parent-java (56-1) ... Setting up hwloc-nox (2.11.2-1) ... Setting up libcommons-logging-java (1.3.0-1) ... Setting up libxext6:armhf (2:1.3.4-1+b2) ... Setting up intltool-debian (0.35.0+20060710.6) ... Setting up libnuma-dev:armhf (2.0.18-1+b1) ... Setting up dh-autoreconf (20) ... Setting up libltdl-dev:armhf (2.4.7-7+b2) ... Setting up ca-certificates (20240203) ... Updating certificates in /etc/ssl/certs... 146 added, 0 removed; done. Setting up libptexenc1:armhf (2024.20240313.70630+ds-4) ... Setting up libfreetype6:armhf (2.13.3+dfsg-1) ... Setting up libgssapi-krb5-2:armhf (1.21.3-3) ... Setting up ucf (3.0043+nmu1) ... Setting up libjs-sphinxdoc (7.4.7-4) ... Setting up libreadline8t64:armhf (8.2-5) ... Setting up dh-strip-nondeterminism (1.14.0-1) ... Setting up groff-base (1.23.0-5) ... Setting up xml-core (0.19) ... Setting up libharfbuzz0b:armhf (10.0.1-1) ... 
Setting up libhwloc-dev:armhf (2.11.2-1) ... Setting up libfontconfig1:armhf (2.15.0-1.1+b1) ... Setting up python3.12-doc (3.12.6-1) ... Setting up libsm6:armhf (2:1.2.4-1) ... Setting up python3-doc (3.12.6-1) ... Setting up gfortran-14-arm-linux-gnueabihf (14.2.0-6) ... Setting up python-numpy-doc (1:1.26.4+ds-11) ... Setting up libpaper1:armhf (1.1.29+b2) ... Creating config file /etc/papersize with new version Setting up libxi6:armhf (2:1.8.2-1) ... Setting up libtirpc3t64:armhf (1.3.4+ds-1.3+b1) ... Setting up mpich (4.2.0-14) ... update-alternatives: using /usr/bin/mpicc.mpich to provide /usr/bin/mpicc (mpi) in auto mode update-alternatives: using /usr/bin/mpirun.mpich to provide /usr/bin/mpirun (mpirun) in auto mode Setting up openssh-client (1:9.9p1-3) ... Setting up po-debconf (1.0.21+nmu1) ... Setting up mpi-default-bin (1.17) ... Setting up libpaper-utils (1.1.29+b2) ... Setting up xfonts-utils (1:7.7+7) ... Setting up man-db (2.13.0-1) ... Not building database; man-db/auto-update is not 'true'. Setting up libxml-sax-perl (1.02+dfsg-3) ... update-perl-sax-parsers: Registering Perl SAX parser XML::SAX::PurePerl with priority 10... update-perl-sax-parsers: Updating overall Perl SAX parser modules info file... Creating config file /etc/perl/XML/SAX/ParserDetails.ini with new version Setting up libcairo2:armhf (1.18.2-2) ... Setting up tex-common (6.18) ... update-language: texlive-base not installed and configured, doing nothing! Setting up sphinx-common (7.4.7-4) ... Setting up libxt6t64:armhf (1:1.2.1-1.2+b1) ... Setting up libxml-libxml-perl (2.0207+dfsg+really+2.0134-5+b1) ... update-perl-sax-parsers: Registering Perl SAX parser XML::LibXML::SAX::Parser with priority 50... update-perl-sax-parsers: Registering Perl SAX parser XML::LibXML::SAX with priority 50... update-perl-sax-parsers: Updating overall Perl SAX parser modules info file... 
Replacing config file /etc/perl/XML/SAX/ParserDetails.ini with new version Setting up libnsl2:armhf (1.3.0-3+b3) ... Setting up gfortran-14 (14.2.0-6) ... Setting up tex-gyre (20180621-6) ... Setting up libxmu6:armhf (2:1.1.3-3+b3) ... Setting up libpython3.12-stdlib:armhf (3.12.6-1) ... Setting up preview-latex-style (13.2-1) ... Setting up python3.12 (3.12.6-1) ... Setting up debhelper (13.20) ... Setting up libxaw7:armhf (2:1.0.14-1+b3) ... Setting up libpython3.12t64:armhf (3.12.6-1) ... Setting up libmpich-dev:armhf (4.2.0-14) ... update-alternatives: using /usr/include/arm-linux-gnueabihf/mpich to provide /usr/include/arm-linux-gnueabihf/mpi (mpi-arm-linux-gnueabihf) in auto mode Setting up texinfo (7.1.1-1) ... Running mktexlsr. This may take some time. ... done. Setting up texlive-binaries (2024.20240313.70630+ds-4) ... update-alternatives: using /usr/bin/xdvi-xaw to provide /usr/bin/xdvi.bin (xdvi.bin) in auto mode update-alternatives: using /usr/bin/bibtex.original to provide /usr/bin/bibtex (bibtex) in auto mode Setting up mpi-default-dev (1.17) ... Setting up texlive-base (2024.20240829-2) ... tl-paper: setting paper size for dvips to a4: /var/lib/texmf/dvips/config/config-paper.ps tl-paper: setting paper size for dvipdfmx to a4: /var/lib/texmf/dvipdfmx/dvipdfmx-paper.cfg tl-paper: setting paper size for xdvi to a4: /var/lib/texmf/xdvi/XDvi-paper tl-paper: setting paper size for pdftex to a4: /var/lib/texmf/tex/generic/tex-ini-files/pdftexconfig.tex Setting up libpython3-stdlib:armhf (3.12.6-1) ... Setting up python3 (3.12.6-1) ... Setting up libpython3.12-dev:armhf (3.12.6-1) ... Setting up python3-zipp (3.20.2-1) ... Setting up python3-autocommand (2.2.2-3) ... Setting up python3-markupsafe (2.1.5-1+b1) ... Setting up python3-roman (4.2-1) ... Setting up python3-jinja2 (3.1.3-1) ... Setting up python3-packaging (24.1-1) ... Setting up texlive-latex-base (2024.20240829-2) ... Setting up python3-certifi (2024.8.30+dfsg-1) ... 
Setting up python3-snowballstemmer (2.2.0-4) ... Setting up texlive-latex-recommended (2024.20240829-2) ... Setting up python3-idna (3.8-2) ... Setting up python3.12-dev (3.12.6-1) ... Setting up cython3 (3.0.11+dfsg-1) ... Setting up python3-typing-extensions (4.12.2-2) ... Setting up texlive-pictures (2024.20240829-2) ... Setting up python3-urllib3 (2.0.7-2) ... Setting up texlive-fonts-recommended (2024.20240829-2) ... Setting up python3-imagesize (1.4.1-1) ... Setting up python3-more-itertools (10.5.0-1) ... Setting up libpython3-dev:armhf (3.12.6-1) ... Setting up python3-jaraco.functools (4.1.0-1) ... Setting up python3-jaraco.context (6.0.0-1) ... Setting up python3-defusedxml (0.7.1-2) ... Setting up texlive (2024.20240829-2) ... Setting up python3-charset-normalizer (3.4.0-1) ... Setting up python3-alabaster (0.7.16-0.1) ... Setting up python3-typeguard (4.4.1-1) ... Setting up latexmk (1:4.85-1) ... Setting up python3-all (3.12.6-1) ... Setting up texlive-latex-extra (2024.20240829-1) ... Setting up python3-inflect (7.3.1-2) ... Setting up libpython3-all-dev:armhf (3.12.6-1) ... Setting up python3-dev (3.12.6-1) ... Setting up python3-pkg-resources (74.1.2-2) ... Setting up python3-all-dev (3.12.6-1) ... Setting up python3-setuptools (74.1.2-2) ... Setting up python3-babel (2.14.0-1) ... update-alternatives: using /usr/bin/pybabel-python3 to provide /usr/bin/pybabel (pybabel) in auto mode Setting up python3-pygments (2.18.0+dfsg-1) ... Setting up python3-chardet (5.2.0+dfsg-1) ... Setting up python3-requests (2.32.3+dfsg-1) ... Setting up python3-numpy (1:1.26.4+ds-11) ... Setting up dh-python (6.20241024) ... Processing triggers for libc-bin (2.40-3) ... Processing triggers for sgml-base (1.31) ... Setting up docutils-common (0.21.2+dfsg-2) ... Processing triggers for sgml-base (1.31) ... Setting up python3-docutils (0.21.2+dfsg-2) ... Setting up python3-sphinx (7.4.7-4) ... Processing triggers for ca-certificates (20240203) ... 
Updating certificates in /etc/ssl/certs... 0 added, 0 removed; done. Running hooks in /etc/ca-certificates/update.d... done. Processing triggers for tex-common (6.18) ... Running updmap-sys. This may take some time... done. Running mktexlsr /var/lib/texmf ... done. Building format(s) --all. This may take some time... done. Reading package lists... Building dependency tree... Reading state information... Reading extended state information... Initializing package states... Writing extended state information... Building tag database... -> Finished parsing the build-deps I: Building the package I: Running cd /build/reproducible-path/mpi4py-4.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../mpi4py_4.0.0-8_source.changes dpkg-buildpackage: info: source package mpi4py dpkg-buildpackage: info: source version 4.0.0-8 dpkg-buildpackage: info: source distribution unstable dpkg-buildpackage: info: source changed by Drew Parsons dpkg-source --before-build . 
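The dpkg-buildpackage invocation above deliberately pins PATH and HOME so that the two reproducible-builds runs differ only in the inputs the framework varies on purpose; afterwards, deciding whether the build is reproducible reduces to comparing the produced artifacts byte for byte, typically via the same SHA-256 digests recorded in the .dsc Checksums-Sha256 field. A minimal sketch of that comparison (hypothetical in-memory artifacts, not the actual reprotest harness):

```python
import hashlib

def artifact_digest(data: bytes) -> str:
    # SHA-256 hex digest, matching the Checksums-Sha256 entries in a .dsc
    return hashlib.sha256(data).hexdigest()

# Two builds count as reproducible when their artifacts are byte-identical,
# which is equivalent to their digests being equal.
first_build = artifact_digest(b"deterministic build output")
second_build = artifact_digest(b"deterministic build output")
assert first_build == second_build
```

In practice the Debian tooling diffs the full .buildinfo/.changes checksums rather than hashing in memory, but the pass/fail criterion is the same digest equality.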
dpkg-buildpackage: info: host architecture armhf debian/rules clean dh clean --with sphinxdoc --buildsystem pybuild debian/rules override_dh_auto_clean make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' dh_auto_clean override_dh_auto_clean I: pybuild base:311: python3.12 setup.py clean running clean removing '/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' (and everything under it) rm -rf `find -name build -type d` rm -rf `find -name _build -type d` rm -rf docs/source/reference : # Remove Cython generated files rm -f src/mpi4py.MPE.c src/mpi4py.MPI.c src/include/mpi4py/mpi4py.MPI_api.h make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' dh_autoreconf_clean -O--buildsystem=pybuild dh_clean -O--buildsystem=pybuild debian/rules binary dh binary --with sphinxdoc --buildsystem pybuild dh_update_autotools_config -O--buildsystem=pybuild debian/rules override_dh_autoreconf make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' dh_autoreconf debian/rules -- cythonize make[2]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' D: removing previously generated by Cython sources find -iname *.c | xargs grep -l 'Generated by Cython' | xargs -r rm python3 setup.py build_src running build_src using Cython 3.0.11 cythonizing 'src/mpi4py/MPI.pyx' -> 'src/mpi4py/MPI.c' make[2]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' dh_auto_configure -O--buildsystem=pybuild I: pybuild base:311: python3.12 setup.py config running config MPI configuration: [mpi] from 'mpi.cfg' MPI C compiler: /usr/bin/mpicc MPI C++ compiler: /usr/bin/mpicxx /usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -c _configtest.c -o _configtest.o /usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -o _configtest success! removing: _configtest.c _configtest.o _configtest arm-linux-gnueabihf-g++ -g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -c _configtest.cxx -o _configtest.o _configtest.cxx:1:10: fatal error: mpi.h: No such file or directory 1 | #include <mpi.h> | ^~~~~~~~ compilation terminated. failure. 
removing: _configtest.cxx _configtest.o debian/rules override_dh_auto_build-arch make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' dh_auto_build override_dh_auto_build-arch -- \ --build-args "--mpicc=/usr/bin/mpicc --mpicxx=/usr/bin/mpicxx" I: pybuild base:311: /usr/bin/python3 setup.py build --mpicc=/usr/bin/mpicc --mpicxx=/usr/bin/mpicxx running build running build_src running build_py creating /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/run.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/bench.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/__init__.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/typing.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/__main__.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py creating /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/aplus.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/server.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/_base.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/pool.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/util.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/__init__.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/_core.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying 
src/mpi4py/futures/__main__.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures creating /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/sync.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/pkl5.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/pool.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/__init__.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/dtlib.py -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/typing.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/run.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/MPI.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/__main__.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/bench.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/__init__.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/py.typed -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/MPI.pxd -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/__init__.pxd -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/libmpi.pxd -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/MPI.h -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py copying src/mpi4py/MPI_api.h -> 
/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py creating /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include creating /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py copying src/mpi4py/include/mpi4py/pycapi.h -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py copying src/mpi4py/include/mpi4py/mpi4py.h -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py copying src/mpi4py/include/mpi4py/mpi4py.i -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py copying src/mpi4py/include/mpi4py/mpi.pxi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py copying src/mpi4py/util/dtlib.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/pool.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/sync.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/pkl5.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/util/__init__.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util copying src/mpi4py/futures/pool.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/__main__.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/aplus.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/server.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/util.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures 
copying src/mpi4py/futures/_core.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/_base.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures copying src/mpi4py/futures/__init__.pyi -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures running build_ext MPI configuration: [mpi] from 'mpi.cfg' MPI C compiler: /usr/bin/mpicc MPI C++ compiler: /usr/bin/mpicxx checking for MPI compile and link ... /usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o success! removing: _configtest.c _configtest.o /usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o /usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -o _configtest success! removing: _configtest.c _configtest.o _configtest checking for MPI ABI support ... 
/usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o failure. removing: _configtest.c _configtest.o /usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o success! removing: _configtest.c _configtest.o checking for missing MPI functions/symbols ... checking for function 'MPI_Type_create_f90_integer' ... /usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o /usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -o _configtest success! removing: _configtest.c _configtest.o _configtest checking for function 'MPI_Type_create_f90_real' ... 
/usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o /usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -o _configtest success! removing: _configtest.c _configtest.o _configtest checking for function 'MPI_Type_create_f90_complex' ... /usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o /usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -o _configtest success! removing: _configtest.c _configtest.o _configtest checking for function 'MPI_Status_c2f' ... /usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o
/usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
checking for function 'MPI_Status_f2c' ...
/usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o
/usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
checking for dlopen() availability ...
checking for header 'dlfcn.h' ...
/usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o
success!
removing: _configtest.c _configtest.o
success!
checking for library 'dl' ...
/usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o
/usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -ldl -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
checking for function 'dlopen' ...
/usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.12 -c _configtest.c -o _configtest.o
/usr/bin/mpicc -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 _configtest.o -L/usr/lib/arm-linux-gnueabihf -ldl -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
building 'mpi4py.MPI' extension
creating build
creating build/temp.linux-armv7l-cpython-312
creating build/temp.linux-armv7l-cpython-312/src
creating build/temp.linux-armv7l-cpython-312/src/mpi4py
/usr/bin/mpicc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O2 -Wall -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DHAVE_DLFCN_H=1 -DHAVE_DLOPEN=1 -Isrc -I/usr/include/python3.12 -c src/mpi4py/MPI.c -o build/temp.linux-armv7l-cpython-312/src/mpi4py/MPI.o
src/mpi4py/MPI.c: In function '__pyx_pymod_exec_MPI':
src/mpi4py/MPI.c:284274:30: note: variable tracking size limit exceeded with '-fvar-tracking-assignments', retrying without
284274 | static CYTHON_SMALL_CODE int __pyx_pymod_exec_MPI(PyObject *__pyx_pyinit_module)
       |                              ^~~~~~~~~~~~~~~~~~~~
/usr/bin/mpicc -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-z,relro -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-armv7l-cpython-312/src/mpi4py/MPI.o -L/usr/lib/arm-linux-gnueabihf -o /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/MPI.cpython-312-arm-linux-gnueabihf.so
make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0'
   debian/rules override_dh_auto_build-indep
make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0'
dh_auto_build override_dh_auto_build-arch -- \
	--build-args "--mpicc=/usr/bin/mpicc --mpicxx=/usr/bin/mpicxx"
I: pybuild base:311: /usr/bin/python3 setup.py build --mpicc=/usr/bin/mpicc --mpicxx=/usr/bin/mpicxx
running build
running build_src
running build_py
copying src/mpi4py/MPI.h -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py
copying src/mpi4py/MPI_api.h -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py
running build_ext
MPI configuration: [mpi] from 'mpi.cfg'
MPI C compiler: /usr/bin/mpicc
MPI C++ compiler: /usr/bin/mpicxx
PYTHONPATH=`pybuild -p 3.12 --print "{build_dir}"` \
	make -C docs/source/ html man info latexpdf SPHINXOPTS="-D today=\"September 29, 2024\""
make[2]: Entering directory '/build/reproducible-path/mpi4py-4.0.0/docs/source'
Running Sphinx v7.4.7
loading translations [en]... done
making output directory...
done
[autosummary] generating autosummary for: changes.rst, citation.rst, develop.rst, guidelines.rst, index.rst, install.rst, intro.rst, license.rst, mpi4py.MPI.rst, mpi4py.bench.rst, ..., mpi4py.run.rst, mpi4py.typing.rst, mpi4py.util.dtlib.rst, mpi4py.util.pkl5.rst, mpi4py.util.pool.rst, mpi4py.util.rst, mpi4py.util.sync.rst, overview.rst, reference.rst, tutorial.rst
[autosummary] generating autosummary for: /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.rst
[autosummary] generating autosummary for: /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.AINT.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.ANY_SOURCE.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.ANY_TAG.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.APPNUM.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Add_error_class.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Add_error_code.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Add_error_string.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Aint_add.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Aint_diff.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Alloc_mem.rst, ..., /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.WIN_SIZE.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.WIN_UNIFIED.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.WTIME_IS_GLOBAL.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Win.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Wtick.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.Wtime.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.buffer.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.get_vendor.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.memory.rst, /build/reproducible-path/mpi4py-4.0.0/docs/source/reference/mpi4py.MPI.pickle.rst
loading intersphinx inventory 'python' from /usr/share/doc/python3/html/objects.inv...
loading intersphinx inventory 'numpy' from /usr/share/doc/python-numpy/html/objects.inv...
loading intersphinx inventory 'dlpack' from https://dmlc.github.io/dlpack/latest/objects.inv...
loading intersphinx inventory 'numba' from https://numba.readthedocs.io/en/stable/objects.inv...
WARNING: failed to reach any of the inventories with the following issues: intersphinx inventory 'https://dmlc.github.io/dlpack/latest/objects.inv' not fetchable due to : HTTPSConnectionPool(host='dmlc.github.io', port=443): Max retries exceeded with url: /dlpack/latest/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
WARNING: failed to reach any of the inventories with the following issues: intersphinx inventory 'https://numba.readthedocs.io/en/stable/objects.inv' not fetchable due to : HTTPSConnectionPool(host='numba.readthedocs.io', port=443): Max retries exceeded with url: /en/stable/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
building [mo]: targets for 0 po files that are out of date
writing output...
building [html]: targets for 22 source files that are out of date
updating environment: [new config] 383 added, 0 changed, 0 removed
reading sources... [ 0%] changes reading sources... [ 1%] citation reading sources... [ 1%] develop reading sources... [ 1%] guidelines reading sources... [ 1%] index reading sources... [ 2%] install reading sources... [ 2%] intro reading sources... [ 2%] license reading sources...
[ 2%] mpi4py reading sources... [ 3%] mpi4py.MPI reading sources... [ 3%] mpi4py.bench reading sources... [ 3%] mpi4py.futures reading sources... [ 3%] mpi4py.run reading sources... [ 4%] mpi4py.typing reading sources... [ 4%] mpi4py.util reading sources... [ 4%] mpi4py.util.dtlib reading sources... [ 4%] mpi4py.util.pkl5 reading sources... [ 5%] mpi4py.util.pool reading sources... [ 5%] mpi4py.util.sync reading sources... [ 5%] overview reading sources... [ 5%] reference reading sources... [ 6%] reference/mpi4py.MPI reading sources... [ 6%] reference/mpi4py.MPI.AINT reading sources... [ 6%] reference/mpi4py.MPI.ANY_SOURCE reading sources... [ 7%] reference/mpi4py.MPI.ANY_TAG reading sources... [ 7%] reference/mpi4py.MPI.APPNUM reading sources... [ 7%] reference/mpi4py.MPI.Add_error_class reading sources... [ 7%] reference/mpi4py.MPI.Add_error_code reading sources... [ 8%] reference/mpi4py.MPI.Add_error_string reading sources... [ 8%] reference/mpi4py.MPI.Aint_add reading sources... [ 8%] reference/mpi4py.MPI.Aint_diff reading sources... [ 8%] reference/mpi4py.MPI.Alloc_mem reading sources... [ 9%] reference/mpi4py.MPI.Attach_buffer reading sources... [ 9%] reference/mpi4py.MPI.BAND reading sources... [ 9%] reference/mpi4py.MPI.BOOL reading sources... [ 9%] reference/mpi4py.MPI.BOR reading sources... [ 10%] reference/mpi4py.MPI.BOTTOM reading sources... [ 10%] reference/mpi4py.MPI.BSEND_OVERHEAD reading sources... [ 10%] reference/mpi4py.MPI.BUFFER_AUTOMATIC reading sources... [ 10%] reference/mpi4py.MPI.BXOR reading sources... [ 11%] reference/mpi4py.MPI.BYTE reading sources... [ 11%] reference/mpi4py.MPI.BottomType reading sources... [ 11%] reference/mpi4py.MPI.BufferAutomaticType reading sources... [ 11%] reference/mpi4py.MPI.CART reading sources... [ 12%] reference/mpi4py.MPI.CHAR reading sources... [ 12%] reference/mpi4py.MPI.CHARACTER reading sources... [ 12%] reference/mpi4py.MPI.COMBINER_CONTIGUOUS reading sources... 
[ 13%] reference/mpi4py.MPI.COMBINER_DARRAY reading sources... [ 13%] reference/mpi4py.MPI.COMBINER_DUP reading sources... [ 13%] reference/mpi4py.MPI.COMBINER_F90_COMPLEX reading sources... [ 13%] reference/mpi4py.MPI.COMBINER_F90_INTEGER reading sources... [ 14%] reference/mpi4py.MPI.COMBINER_F90_REAL reading sources... [ 14%] reference/mpi4py.MPI.COMBINER_HINDEXED reading sources... [ 14%] reference/mpi4py.MPI.COMBINER_HINDEXED_BLOCK reading sources... [ 14%] reference/mpi4py.MPI.COMBINER_HVECTOR reading sources... [ 15%] reference/mpi4py.MPI.COMBINER_INDEXED reading sources... [ 15%] reference/mpi4py.MPI.COMBINER_INDEXED_BLOCK reading sources... [ 15%] reference/mpi4py.MPI.COMBINER_NAMED reading sources... [ 15%] reference/mpi4py.MPI.COMBINER_RESIZED reading sources... [ 16%] reference/mpi4py.MPI.COMBINER_STRUCT reading sources... [ 16%] reference/mpi4py.MPI.COMBINER_SUBARRAY reading sources... [ 16%] reference/mpi4py.MPI.COMBINER_VALUE_INDEX reading sources... [ 16%] reference/mpi4py.MPI.COMBINER_VECTOR reading sources... [ 17%] reference/mpi4py.MPI.COMM_NULL reading sources... [ 17%] reference/mpi4py.MPI.COMM_SELF reading sources... [ 17%] reference/mpi4py.MPI.COMM_TYPE_HW_GUIDED reading sources... [ 17%] reference/mpi4py.MPI.COMM_TYPE_HW_UNGUIDED reading sources... [ 18%] reference/mpi4py.MPI.COMM_TYPE_RESOURCE_GUIDED reading sources... [ 18%] reference/mpi4py.MPI.COMM_TYPE_SHARED reading sources... [ 18%] reference/mpi4py.MPI.COMM_WORLD reading sources... [ 19%] reference/mpi4py.MPI.COMPLEX reading sources... [ 19%] reference/mpi4py.MPI.COMPLEX16 reading sources... [ 19%] reference/mpi4py.MPI.COMPLEX32 reading sources... [ 19%] reference/mpi4py.MPI.COMPLEX4 reading sources... [ 20%] reference/mpi4py.MPI.COMPLEX8 reading sources... [ 20%] reference/mpi4py.MPI.CONGRUENT reading sources... [ 20%] reference/mpi4py.MPI.COUNT reading sources... [ 20%] reference/mpi4py.MPI.CXX_BOOL reading sources... 
[ 21%] reference/mpi4py.MPI.CXX_DOUBLE_COMPLEX reading sources... [ 21%] reference/mpi4py.MPI.CXX_FLOAT_COMPLEX reading sources... [ 21%] reference/mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX reading sources... [ 21%] reference/mpi4py.MPI.C_BOOL reading sources... [ 22%] reference/mpi4py.MPI.C_COMPLEX reading sources... [ 22%] reference/mpi4py.MPI.C_DOUBLE_COMPLEX reading sources... [ 22%] reference/mpi4py.MPI.C_FLOAT_COMPLEX reading sources... [ 22%] reference/mpi4py.MPI.C_LONG_DOUBLE_COMPLEX reading sources... [ 23%] reference/mpi4py.MPI.Cartcomm reading sources... [ 23%] reference/mpi4py.MPI.Close_port reading sources... [ 23%] reference/mpi4py.MPI.Comm reading sources... [ 23%] reference/mpi4py.MPI.Compute_dims reading sources... [ 24%] reference/mpi4py.MPI.DATATYPE_NULL reading sources... [ 24%] reference/mpi4py.MPI.DISPLACEMENT_CURRENT reading sources... [ 24%] reference/mpi4py.MPI.DISP_CUR reading sources... [ 25%] reference/mpi4py.MPI.DISTRIBUTE_BLOCK reading sources... [ 25%] reference/mpi4py.MPI.DISTRIBUTE_CYCLIC reading sources... [ 25%] reference/mpi4py.MPI.DISTRIBUTE_DFLT_DARG reading sources... [ 25%] reference/mpi4py.MPI.DISTRIBUTE_NONE reading sources... [ 26%] reference/mpi4py.MPI.DIST_GRAPH reading sources... [ 26%] reference/mpi4py.MPI.DOUBLE reading sources... [ 26%] reference/mpi4py.MPI.DOUBLE_COMPLEX reading sources... [ 26%] reference/mpi4py.MPI.DOUBLE_INT reading sources... [ 27%] reference/mpi4py.MPI.DOUBLE_PRECISION reading sources... [ 27%] reference/mpi4py.MPI.Datatype reading sources... [ 27%] reference/mpi4py.MPI.Detach_buffer reading sources... [ 27%] reference/mpi4py.MPI.Distgraphcomm reading sources... [ 28%] reference/mpi4py.MPI.ERRHANDLER_NULL reading sources... [ 28%] reference/mpi4py.MPI.ERRORS_ABORT reading sources... [ 28%] reference/mpi4py.MPI.ERRORS_ARE_FATAL reading sources... [ 28%] reference/mpi4py.MPI.ERRORS_RETURN reading sources... [ 29%] reference/mpi4py.MPI.ERR_ACCESS reading sources... 
[ 29%] reference/mpi4py.MPI.ERR_AMODE reading sources... [ 29%] reference/mpi4py.MPI.ERR_ARG reading sources... [ 30%] reference/mpi4py.MPI.ERR_ASSERT reading sources... [ 30%] reference/mpi4py.MPI.ERR_BAD_FILE reading sources... [ 30%] reference/mpi4py.MPI.ERR_BASE reading sources... [ 30%] reference/mpi4py.MPI.ERR_BUFFER reading sources... [ 31%] reference/mpi4py.MPI.ERR_COMM reading sources... [ 31%] reference/mpi4py.MPI.ERR_CONVERSION reading sources... [ 31%] reference/mpi4py.MPI.ERR_COUNT reading sources... [ 31%] reference/mpi4py.MPI.ERR_DIMS reading sources... [ 32%] reference/mpi4py.MPI.ERR_DISP reading sources... [ 32%] reference/mpi4py.MPI.ERR_DUP_DATAREP reading sources... [ 32%] reference/mpi4py.MPI.ERR_ERRHANDLER reading sources... [ 32%] reference/mpi4py.MPI.ERR_FILE reading sources... [ 33%] reference/mpi4py.MPI.ERR_FILE_EXISTS reading sources... [ 33%] reference/mpi4py.MPI.ERR_FILE_IN_USE reading sources... [ 33%] reference/mpi4py.MPI.ERR_GROUP reading sources... [ 33%] reference/mpi4py.MPI.ERR_INFO reading sources... [ 34%] reference/mpi4py.MPI.ERR_INFO_KEY reading sources... [ 34%] reference/mpi4py.MPI.ERR_INFO_NOKEY reading sources... [ 34%] reference/mpi4py.MPI.ERR_INFO_VALUE reading sources... [ 34%] reference/mpi4py.MPI.ERR_INTERN reading sources... [ 35%] reference/mpi4py.MPI.ERR_IN_STATUS reading sources... [ 35%] reference/mpi4py.MPI.ERR_IO reading sources... [ 35%] reference/mpi4py.MPI.ERR_KEYVAL reading sources... [ 36%] reference/mpi4py.MPI.ERR_LASTCODE reading sources... [ 36%] reference/mpi4py.MPI.ERR_LOCKTYPE reading sources... [ 36%] reference/mpi4py.MPI.ERR_NAME reading sources... [ 36%] reference/mpi4py.MPI.ERR_NOT_SAME reading sources... [ 37%] reference/mpi4py.MPI.ERR_NO_MEM reading sources... [ 37%] reference/mpi4py.MPI.ERR_NO_SPACE reading sources... [ 37%] reference/mpi4py.MPI.ERR_NO_SUCH_FILE reading sources... [ 37%] reference/mpi4py.MPI.ERR_OP reading sources... [ 38%] reference/mpi4py.MPI.ERR_OTHER reading sources... 
[ 38%] reference/mpi4py.MPI.ERR_PENDING reading sources... [ 38%] reference/mpi4py.MPI.ERR_PORT reading sources... [ 38%] reference/mpi4py.MPI.ERR_PROC_ABORTED reading sources... [ 39%] reference/mpi4py.MPI.ERR_PROC_FAILED reading sources... [ 39%] reference/mpi4py.MPI.ERR_PROC_FAILED_PENDING reading sources... [ 39%] reference/mpi4py.MPI.ERR_QUOTA reading sources... [ 39%] reference/mpi4py.MPI.ERR_RANK reading sources... [ 40%] reference/mpi4py.MPI.ERR_READ_ONLY reading sources... [ 40%] reference/mpi4py.MPI.ERR_REQUEST reading sources... [ 40%] reference/mpi4py.MPI.ERR_REVOKED reading sources... [ 40%] reference/mpi4py.MPI.ERR_RMA_ATTACH reading sources... [ 41%] reference/mpi4py.MPI.ERR_RMA_CONFLICT reading sources... [ 41%] reference/mpi4py.MPI.ERR_RMA_FLAVOR reading sources... [ 41%] reference/mpi4py.MPI.ERR_RMA_RANGE reading sources... [ 42%] reference/mpi4py.MPI.ERR_RMA_SHARED reading sources... [ 42%] reference/mpi4py.MPI.ERR_RMA_SYNC reading sources... [ 42%] reference/mpi4py.MPI.ERR_ROOT reading sources... [ 42%] reference/mpi4py.MPI.ERR_SERVICE reading sources... [ 43%] reference/mpi4py.MPI.ERR_SESSION reading sources... [ 43%] reference/mpi4py.MPI.ERR_SIZE reading sources... [ 43%] reference/mpi4py.MPI.ERR_SPAWN reading sources... [ 43%] reference/mpi4py.MPI.ERR_TAG reading sources... [ 44%] reference/mpi4py.MPI.ERR_TOPOLOGY reading sources... [ 44%] reference/mpi4py.MPI.ERR_TRUNCATE reading sources... [ 44%] reference/mpi4py.MPI.ERR_TYPE reading sources... [ 44%] reference/mpi4py.MPI.ERR_UNKNOWN reading sources... [ 45%] reference/mpi4py.MPI.ERR_UNSUPPORTED_DATAREP reading sources... [ 45%] reference/mpi4py.MPI.ERR_UNSUPPORTED_OPERATION reading sources... [ 45%] reference/mpi4py.MPI.ERR_VALUE_TOO_LARGE reading sources... [ 45%] reference/mpi4py.MPI.ERR_WIN reading sources... [ 46%] reference/mpi4py.MPI.Errhandler reading sources... [ 46%] reference/mpi4py.MPI.Exception reading sources... [ 46%] reference/mpi4py.MPI.FILE_NULL reading sources... 
[ 46%] reference/mpi4py.MPI.FLOAT reading sources... [ 47%] reference/mpi4py.MPI.FLOAT_INT reading sources... [ 47%] reference/mpi4py.MPI.F_BOOL reading sources... [ 47%] reference/mpi4py.MPI.F_COMPLEX reading sources... [ 48%] reference/mpi4py.MPI.F_DOUBLE reading sources... [ 48%] reference/mpi4py.MPI.F_DOUBLE_COMPLEX reading sources... [ 48%] reference/mpi4py.MPI.F_ERROR reading sources... [ 48%] reference/mpi4py.MPI.F_FLOAT reading sources... [ 49%] reference/mpi4py.MPI.F_FLOAT_COMPLEX reading sources... [ 49%] reference/mpi4py.MPI.F_INT reading sources... [ 49%] reference/mpi4py.MPI.F_SOURCE reading sources... [ 49%] reference/mpi4py.MPI.F_STATUS_SIZE reading sources... [ 50%] reference/mpi4py.MPI.F_TAG reading sources... [ 50%] reference/mpi4py.MPI.File reading sources... [ 50%] reference/mpi4py.MPI.Finalize reading sources... [ 50%] reference/mpi4py.MPI.Flush_buffer reading sources... [ 51%] reference/mpi4py.MPI.Free_mem reading sources... [ 51%] reference/mpi4py.MPI.GRAPH reading sources... [ 51%] reference/mpi4py.MPI.GROUP_EMPTY reading sources... [ 51%] reference/mpi4py.MPI.GROUP_NULL reading sources... [ 52%] reference/mpi4py.MPI.Get_address reading sources... [ 52%] reference/mpi4py.MPI.Get_error_class reading sources... [ 52%] reference/mpi4py.MPI.Get_error_string reading sources... [ 52%] reference/mpi4py.MPI.Get_hw_resource_info reading sources... [ 53%] reference/mpi4py.MPI.Get_library_version reading sources... [ 53%] reference/mpi4py.MPI.Get_processor_name reading sources... [ 53%] reference/mpi4py.MPI.Get_version reading sources... [ 54%] reference/mpi4py.MPI.Graphcomm reading sources... [ 54%] reference/mpi4py.MPI.Grequest reading sources... [ 54%] reference/mpi4py.MPI.Group reading sources... [ 54%] reference/mpi4py.MPI.IDENT reading sources... [ 55%] reference/mpi4py.MPI.INFO_ENV reading sources... [ 55%] reference/mpi4py.MPI.INFO_NULL reading sources... [ 55%] reference/mpi4py.MPI.INT reading sources... 
[ 55%] reference/mpi4py.MPI.INT16_T reading sources... [ 56%] reference/mpi4py.MPI.INT32_T reading sources... [ 56%] reference/mpi4py.MPI.INT64_T reading sources... [ 56%] reference/mpi4py.MPI.INT8_T reading sources... [ 56%] reference/mpi4py.MPI.INTEGER reading sources... [ 57%] reference/mpi4py.MPI.INTEGER1 reading sources... [ 57%] reference/mpi4py.MPI.INTEGER16 reading sources... [ 57%] reference/mpi4py.MPI.INTEGER2 reading sources... [ 57%] reference/mpi4py.MPI.INTEGER4 reading sources... [ 58%] reference/mpi4py.MPI.INTEGER8 reading sources... [ 58%] reference/mpi4py.MPI.INT_INT reading sources... [ 58%] reference/mpi4py.MPI.IN_PLACE reading sources... [ 58%] reference/mpi4py.MPI.IO reading sources... [ 59%] reference/mpi4py.MPI.Iflush_buffer reading sources... [ 59%] reference/mpi4py.MPI.InPlaceType reading sources... [ 59%] reference/mpi4py.MPI.Info reading sources... [ 60%] reference/mpi4py.MPI.Init reading sources... [ 60%] reference/mpi4py.MPI.Init_thread reading sources... [ 60%] reference/mpi4py.MPI.Intercomm reading sources... [ 60%] reference/mpi4py.MPI.Intracomm reading sources... [ 61%] reference/mpi4py.MPI.Is_finalized reading sources... [ 61%] reference/mpi4py.MPI.Is_initialized reading sources... [ 61%] reference/mpi4py.MPI.Is_thread_main reading sources... [ 61%] reference/mpi4py.MPI.KEYVAL_INVALID reading sources... [ 62%] reference/mpi4py.MPI.LAND reading sources... [ 62%] reference/mpi4py.MPI.LASTUSEDCODE reading sources... [ 62%] reference/mpi4py.MPI.LOCK_EXCLUSIVE reading sources... [ 62%] reference/mpi4py.MPI.LOCK_SHARED reading sources... [ 63%] reference/mpi4py.MPI.LOGICAL reading sources... [ 63%] reference/mpi4py.MPI.LOGICAL1 reading sources... [ 63%] reference/mpi4py.MPI.LOGICAL2 reading sources... [ 63%] reference/mpi4py.MPI.LOGICAL4 reading sources... [ 64%] reference/mpi4py.MPI.LOGICAL8 reading sources... [ 64%] reference/mpi4py.MPI.LONG reading sources... [ 64%] reference/mpi4py.MPI.LONG_DOUBLE reading sources... 
[ 64%] reference/mpi4py.MPI.LONG_DOUBLE_INT reading sources... [ 65%] reference/mpi4py.MPI.LONG_INT reading sources... [ 65%] reference/mpi4py.MPI.LONG_LONG reading sources... [ 65%] reference/mpi4py.MPI.LOR reading sources... [ 66%] reference/mpi4py.MPI.LXOR reading sources... [ 66%] reference/mpi4py.MPI.Lookup_name reading sources... [ 66%] reference/mpi4py.MPI.MAX reading sources... [ 66%] reference/mpi4py.MPI.MAXLOC reading sources... [ 67%] reference/mpi4py.MPI.MAX_DATAREP_STRING reading sources... [ 67%] reference/mpi4py.MPI.MAX_ERROR_STRING reading sources... [ 67%] reference/mpi4py.MPI.MAX_INFO_KEY reading sources... [ 67%] reference/mpi4py.MPI.MAX_INFO_VAL reading sources... [ 68%] reference/mpi4py.MPI.MAX_LIBRARY_VERSION_STRING reading sources... [ 68%] reference/mpi4py.MPI.MAX_OBJECT_NAME reading sources... [ 68%] reference/mpi4py.MPI.MAX_PORT_NAME reading sources... [ 68%] reference/mpi4py.MPI.MAX_PROCESSOR_NAME reading sources... [ 69%] reference/mpi4py.MPI.MAX_PSET_NAME_LEN reading sources... [ 69%] reference/mpi4py.MPI.MAX_STRINGTAG_LEN reading sources... [ 69%] reference/mpi4py.MPI.MESSAGE_NO_PROC reading sources... [ 69%] reference/mpi4py.MPI.MESSAGE_NULL reading sources... [ 70%] reference/mpi4py.MPI.MIN reading sources... [ 70%] reference/mpi4py.MPI.MINLOC reading sources... [ 70%] reference/mpi4py.MPI.MODE_APPEND reading sources... [ 70%] reference/mpi4py.MPI.MODE_CREATE reading sources... [ 71%] reference/mpi4py.MPI.MODE_DELETE_ON_CLOSE reading sources... [ 71%] reference/mpi4py.MPI.MODE_EXCL reading sources... [ 71%] reference/mpi4py.MPI.MODE_NOCHECK reading sources... [ 72%] reference/mpi4py.MPI.MODE_NOPRECEDE reading sources... [ 72%] reference/mpi4py.MPI.MODE_NOPUT reading sources... [ 72%] reference/mpi4py.MPI.MODE_NOSTORE reading sources... [ 72%] reference/mpi4py.MPI.MODE_NOSUCCEED reading sources... [ 73%] reference/mpi4py.MPI.MODE_RDONLY reading sources... [ 73%] reference/mpi4py.MPI.MODE_RDWR reading sources... 
[ 73%] reference/mpi4py.MPI.MODE_SEQUENTIAL reading sources... [ 73%] reference/mpi4py.MPI.MODE_UNIQUE_OPEN reading sources... [ 74%] reference/mpi4py.MPI.MODE_WRONLY reading sources... [ 74%] reference/mpi4py.MPI.Message reading sources... [ 74%] reference/mpi4py.MPI.NO_OP reading sources... [ 74%] reference/mpi4py.MPI.OFFSET reading sources... [ 75%] reference/mpi4py.MPI.OP_NULL reading sources... [ 75%] reference/mpi4py.MPI.ORDER_C reading sources... [ 75%] reference/mpi4py.MPI.ORDER_F reading sources... [ 75%] reference/mpi4py.MPI.ORDER_FORTRAN reading sources... [ 76%] reference/mpi4py.MPI.Op reading sources... [ 76%] reference/mpi4py.MPI.Open_port reading sources... [ 76%] reference/mpi4py.MPI.PACKED reading sources... [ 77%] reference/mpi4py.MPI.PROC_NULL reading sources... [ 77%] reference/mpi4py.MPI.PROD reading sources... [ 77%] reference/mpi4py.MPI.Pcontrol reading sources... [ 77%] reference/mpi4py.MPI.Pickle reading sources... [ 78%] reference/mpi4py.MPI.Prequest reading sources... [ 78%] reference/mpi4py.MPI.Publish_name reading sources... [ 78%] reference/mpi4py.MPI.Query_thread reading sources... [ 78%] reference/mpi4py.MPI.REAL reading sources... [ 79%] reference/mpi4py.MPI.REAL16 reading sources... [ 79%] reference/mpi4py.MPI.REAL2 reading sources... [ 79%] reference/mpi4py.MPI.REAL4 reading sources... [ 79%] reference/mpi4py.MPI.REAL8 reading sources... [ 80%] reference/mpi4py.MPI.REPLACE reading sources... [ 80%] reference/mpi4py.MPI.REQUEST_NULL reading sources... [ 80%] reference/mpi4py.MPI.ROOT reading sources... [ 80%] reference/mpi4py.MPI.Register_datarep reading sources... [ 81%] reference/mpi4py.MPI.Remove_error_class reading sources... [ 81%] reference/mpi4py.MPI.Remove_error_code reading sources... [ 81%] reference/mpi4py.MPI.Remove_error_string reading sources... [ 81%] reference/mpi4py.MPI.Request reading sources... [ 82%] reference/mpi4py.MPI.SEEK_CUR reading sources... [ 82%] reference/mpi4py.MPI.SEEK_END reading sources... 
[ 82%] reference/mpi4py.MPI.SEEK_SET reading sources... [ 83%] reference/mpi4py.MPI.SESSION_NULL reading sources... [ 83%] reference/mpi4py.MPI.SHORT reading sources... [ 83%] reference/mpi4py.MPI.SHORT_INT reading sources... [ 83%] reference/mpi4py.MPI.SIGNED_CHAR reading sources... [ 84%] reference/mpi4py.MPI.SIGNED_INT reading sources... [ 84%] reference/mpi4py.MPI.SIGNED_LONG reading sources... [ 84%] reference/mpi4py.MPI.SIGNED_LONG_LONG reading sources... [ 84%] reference/mpi4py.MPI.SIGNED_SHORT reading sources... [ 85%] reference/mpi4py.MPI.SIMILAR reading sources... [ 85%] reference/mpi4py.MPI.SINT16_T reading sources... [ 85%] reference/mpi4py.MPI.SINT32_T reading sources... [ 85%] reference/mpi4py.MPI.SINT64_T reading sources... [ 86%] reference/mpi4py.MPI.SINT8_T reading sources... [ 86%] reference/mpi4py.MPI.SUBVERSION reading sources... [ 86%] reference/mpi4py.MPI.SUCCESS reading sources... [ 86%] reference/mpi4py.MPI.SUM reading sources... [ 87%] reference/mpi4py.MPI.Session reading sources... [ 87%] reference/mpi4py.MPI.Status reading sources... [ 87%] reference/mpi4py.MPI.TAG_UB reading sources... [ 87%] reference/mpi4py.MPI.THREAD_FUNNELED reading sources... [ 88%] reference/mpi4py.MPI.THREAD_MULTIPLE reading sources... [ 88%] reference/mpi4py.MPI.THREAD_SERIALIZED reading sources... [ 88%] reference/mpi4py.MPI.THREAD_SINGLE reading sources... [ 89%] reference/mpi4py.MPI.TWOINT reading sources... [ 89%] reference/mpi4py.MPI.TYPECLASS_COMPLEX reading sources... [ 89%] reference/mpi4py.MPI.TYPECLASS_INTEGER reading sources... [ 89%] reference/mpi4py.MPI.TYPECLASS_REAL reading sources... [ 90%] reference/mpi4py.MPI.Topocomm reading sources... [ 90%] reference/mpi4py.MPI.UINT16_T reading sources... [ 90%] reference/mpi4py.MPI.UINT32_T reading sources... [ 90%] reference/mpi4py.MPI.UINT64_T reading sources... [ 91%] reference/mpi4py.MPI.UINT8_T reading sources... [ 91%] reference/mpi4py.MPI.UNDEFINED reading sources... 
[ 91%] reference/mpi4py.MPI.UNEQUAL reading sources... [ 91%] reference/mpi4py.MPI.UNIVERSE_SIZE reading sources... [ 92%] reference/mpi4py.MPI.UNSIGNED reading sources... [ 92%] reference/mpi4py.MPI.UNSIGNED_CHAR reading sources... [ 92%] reference/mpi4py.MPI.UNSIGNED_INT reading sources... [ 92%] reference/mpi4py.MPI.UNSIGNED_LONG reading sources... [ 93%] reference/mpi4py.MPI.UNSIGNED_LONG_LONG reading sources... [ 93%] reference/mpi4py.MPI.UNSIGNED_SHORT reading sources... [ 93%] reference/mpi4py.MPI.UNWEIGHTED reading sources... [ 93%] reference/mpi4py.MPI.Unpublish_name reading sources... [ 94%] reference/mpi4py.MPI.VERSION reading sources... [ 94%] reference/mpi4py.MPI.WCHAR reading sources... [ 94%] reference/mpi4py.MPI.WEIGHTS_EMPTY reading sources... [ 95%] reference/mpi4py.MPI.WIN_BASE reading sources... [ 95%] reference/mpi4py.MPI.WIN_CREATE_FLAVOR reading sources... [ 95%] reference/mpi4py.MPI.WIN_DISP_UNIT reading sources... [ 95%] reference/mpi4py.MPI.WIN_FLAVOR reading sources... [ 96%] reference/mpi4py.MPI.WIN_FLAVOR_ALLOCATE reading sources... [ 96%] reference/mpi4py.MPI.WIN_FLAVOR_CREATE reading sources... [ 96%] reference/mpi4py.MPI.WIN_FLAVOR_DYNAMIC reading sources... [ 96%] reference/mpi4py.MPI.WIN_FLAVOR_SHARED reading sources... [ 97%] reference/mpi4py.MPI.WIN_MODEL reading sources... [ 97%] reference/mpi4py.MPI.WIN_NULL reading sources... [ 97%] reference/mpi4py.MPI.WIN_SEPARATE reading sources... [ 97%] reference/mpi4py.MPI.WIN_SIZE reading sources... [ 98%] reference/mpi4py.MPI.WIN_UNIFIED reading sources... [ 98%] reference/mpi4py.MPI.WTIME_IS_GLOBAL reading sources... [ 98%] reference/mpi4py.MPI.Win reading sources... [ 98%] reference/mpi4py.MPI.Wtick reading sources... [ 99%] reference/mpi4py.MPI.Wtime reading sources... [ 99%] reference/mpi4py.MPI.buffer reading sources... [ 99%] reference/mpi4py.MPI.get_vendor reading sources... [ 99%] reference/mpi4py.MPI.memory reading sources... 
[100%] reference/mpi4py.MPI.pickle
reading sources... [100%] tutorial
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
copying assets... copying static files... done copying extra files... done copying assets: done
writing output... [ 0%] changes
writing output... [ 1%] citation
writing output... [ 1%] develop
writing output... [ 1%] guidelines
writing output... [ 1%] index
writing output... [ 2%] install
writing output... [ 2%] intro
writing output... [ 2%] license
writing output... [ 2%] mpi4py
writing output... [ 3%] mpi4py.MPI
writing output... [ 3%] mpi4py.bench
writing output... [ 3%] mpi4py.futures
writing output... [ 3%] mpi4py.run
writing output... [ 4%] mpi4py.typing
writing output... [ 4%] mpi4py.util
writing output... [ 4%] mpi4py.util.dtlib
writing output... [ 4%] mpi4py.util.pkl5
writing output... [ 5%] mpi4py.util.pool
writing output... [ 5%] mpi4py.util.sync
writing output... [ 5%] overview
writing output... [ 5%] reference
writing output... [ 6%] reference/mpi4py.MPI
writing output... [ 6%] reference/mpi4py.MPI.AINT
...
writing output... [100%] reference/mpi4py.MPI.pickle
writing output... [100%] tutorial
/build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsDLPack:3: WARNING: undefined label: 'dlpack:python-spec'
/build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsCAI:3: WARNING: undefined label: 'numba:cuda-array-interface'
/build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: dlpack:python-spec
/build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: numba:cuda-array-interface
generating indices... genindex py-modindex done
writing additional pages... search done
dumping search index in English (code: en)... done
dumping object inventory... done
build succeeded, 6 warnings.

The HTML pages are in _build/html.

Running Sphinx v7.4.7
loading translations [en]... done
making output directory... done
loading pickled environment... done
[autosummary] generating autosummary for: changes.rst, citation.rst, develop.rst, guidelines.rst, index.rst, install.rst, intro.rst, license.rst, mpi4py.MPI.rst, mpi4py.bench.rst, ..., reference/mpi4py.MPI.WTIME_IS_GLOBAL.rst, reference/mpi4py.MPI.Win.rst, reference/mpi4py.MPI.Wtick.rst, reference/mpi4py.MPI.Wtime.rst, reference/mpi4py.MPI.buffer.rst, reference/mpi4py.MPI.get_vendor.rst, reference/mpi4py.MPI.memory.rst, reference/mpi4py.MPI.pickle.rst, reference/mpi4py.MPI.rst, tutorial.rst
loading intersphinx inventory 'python' from /usr/share/doc/python3/html/objects.inv...
loading intersphinx inventory 'numpy' from /usr/share/doc/python-numpy/html/objects.inv...
loading intersphinx inventory 'dlpack' from https://dmlc.github.io/dlpack/latest/objects.inv...
loading intersphinx inventory 'numba' from https://numba.readthedocs.io/en/stable/objects.inv...
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://dmlc.github.io/dlpack/latest/objects.inv' not fetchable due to : HTTPSConnectionPool(host='dmlc.github.io', port=443): Max retries exceeded with url: /dlpack/latest/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://numba.readthedocs.io/en/stable/objects.inv' not fetchable due to : HTTPSConnectionPool(host='numba.readthedocs.io', port=443): Max retries exceeded with url: /en/stable/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
building [mo]: targets for 0 po files that are out of date
writing output...
building [man]: all manpages
updating environment: 0 added, 0 changed, 0 removed
reading sources...
looking for now-outdated files... none found
writing...
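The two intersphinx failures above occur because pbuilder disables network access during the build, so only the inventories read from local paths (`python`, `numpy`) resolve; the `dlpack` and `numba` inventories point at remote URLs and cannot be fetched. One way such a build could avoid both the fetch failures and the later "undefined label" warnings is to ship the inventories with the source and point `intersphinx_mapping` at them; the sketch below is a hypothetical `conf.py` fragment (the `inventories/*.inv` paths are assumptions, not part of this package), using intersphinx's documented support for a local file as the inventory location while keeping the remote base URL for generated links.

```python
# Hypothetical Sphinx conf.py fragment: resolve intersphinx inventories offline.
# Each value is (base URL used for generated links, inventory location);
# None means "fetch objects.inv from the base", a string means "read it here".
intersphinx_mapping = {
    # Local HTML trees, as already used in the log above:
    "python": ("/usr/share/doc/python3/html", None),
    "numpy": ("/usr/share/doc/python-numpy/html", None),
    # Remote link targets, but inventories read from vendored files
    # (paths are illustrative placeholders):
    "dlpack": ("https://dmlc.github.io/dlpack/latest/", "inventories/dlpack-objects.inv"),
    "numba": ("https://numba.readthedocs.io/en/stable/", "inventories/numba-objects.inv"),
}
```

With a mapping like this, no builder run needs the proxy, and labels such as `dlpack:python-spec` would resolve from the vendored inventory files.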
mpi4py.3 { intro overview tutorial mpi4py mpi4py.MPI mpi4py.typing mpi4py.futures mpi4py.util mpi4py.util.dtlib mpi4py.util.pkl5 mpi4py.util.pool mpi4py.util.sync mpi4py.run mpi4py.bench reference reference/mpi4py.MPI reference/mpi4py.MPI.BottomType reference/mpi4py.MPI.BufferAutomaticType reference/mpi4py.MPI.Cartcomm reference/mpi4py.MPI.Comm reference/mpi4py.MPI.Datatype reference/mpi4py.MPI.Distgraphcomm reference/mpi4py.MPI.Errhandler reference/mpi4py.MPI.File reference/mpi4py.MPI.Graphcomm reference/mpi4py.MPI.Grequest reference/mpi4py.MPI.Group reference/mpi4py.MPI.InPlaceType reference/mpi4py.MPI.Info reference/mpi4py.MPI.Intercomm reference/mpi4py.MPI.Intracomm reference/mpi4py.MPI.Message reference/mpi4py.MPI.Op reference/mpi4py.MPI.Pickle reference/mpi4py.MPI.Prequest reference/mpi4py.MPI.Request reference/mpi4py.MPI.Session reference/mpi4py.MPI.Status reference/mpi4py.MPI.Topocomm reference/mpi4py.MPI.Win reference/mpi4py.MPI.buffer reference/mpi4py.MPI.memory reference/mpi4py.MPI.Exception reference/mpi4py.MPI.Add_error_class reference/mpi4py.MPI.Add_error_code reference/mpi4py.MPI.Add_error_string reference/mpi4py.MPI.Aint_add reference/mpi4py.MPI.Aint_diff reference/mpi4py.MPI.Alloc_mem reference/mpi4py.MPI.Attach_buffer reference/mpi4py.MPI.Close_port reference/mpi4py.MPI.Compute_dims reference/mpi4py.MPI.Detach_buffer reference/mpi4py.MPI.Finalize reference/mpi4py.MPI.Flush_buffer reference/mpi4py.MPI.Free_mem reference/mpi4py.MPI.Get_address reference/mpi4py.MPI.Get_error_class reference/mpi4py.MPI.Get_error_string reference/mpi4py.MPI.Get_hw_resource_info reference/mpi4py.MPI.Get_library_version reference/mpi4py.MPI.Get_processor_name reference/mpi4py.MPI.Get_version reference/mpi4py.MPI.Iflush_buffer reference/mpi4py.MPI.Init reference/mpi4py.MPI.Init_thread reference/mpi4py.MPI.Is_finalized reference/mpi4py.MPI.Is_initialized reference/mpi4py.MPI.Is_thread_main reference/mpi4py.MPI.Lookup_name reference/mpi4py.MPI.Open_port 
reference/mpi4py.MPI.Pcontrol reference/mpi4py.MPI.Publish_name reference/mpi4py.MPI.Query_thread reference/mpi4py.MPI.Register_datarep reference/mpi4py.MPI.Remove_error_class reference/mpi4py.MPI.Remove_error_code reference/mpi4py.MPI.Remove_error_string reference/mpi4py.MPI.Unpublish_name reference/mpi4py.MPI.Wtick reference/mpi4py.MPI.Wtime reference/mpi4py.MPI.get_vendor reference/mpi4py.MPI.UNDEFINED reference/mpi4py.MPI.ANY_SOURCE reference/mpi4py.MPI.ANY_TAG reference/mpi4py.MPI.PROC_NULL reference/mpi4py.MPI.ROOT reference/mpi4py.MPI.BOTTOM reference/mpi4py.MPI.IN_PLACE reference/mpi4py.MPI.KEYVAL_INVALID reference/mpi4py.MPI.TAG_UB reference/mpi4py.MPI.IO reference/mpi4py.MPI.WTIME_IS_GLOBAL reference/mpi4py.MPI.UNIVERSE_SIZE reference/mpi4py.MPI.APPNUM reference/mpi4py.MPI.LASTUSEDCODE reference/mpi4py.MPI.WIN_BASE reference/mpi4py.MPI.WIN_SIZE reference/mpi4py.MPI.WIN_DISP_UNIT reference/mpi4py.MPI.WIN_CREATE_FLAVOR reference/mpi4py.MPI.WIN_FLAVOR reference/mpi4py.MPI.WIN_MODEL reference/mpi4py.MPI.SUCCESS reference/mpi4py.MPI.ERR_LASTCODE reference/mpi4py.MPI.ERR_TYPE reference/mpi4py.MPI.ERR_REQUEST reference/mpi4py.MPI.ERR_OP reference/mpi4py.MPI.ERR_GROUP reference/mpi4py.MPI.ERR_INFO reference/mpi4py.MPI.ERR_ERRHANDLER reference/mpi4py.MPI.ERR_SESSION reference/mpi4py.MPI.ERR_COMM reference/mpi4py.MPI.ERR_WIN reference/mpi4py.MPI.ERR_FILE reference/mpi4py.MPI.ERR_BUFFER reference/mpi4py.MPI.ERR_COUNT reference/mpi4py.MPI.ERR_TAG reference/mpi4py.MPI.ERR_RANK reference/mpi4py.MPI.ERR_ROOT reference/mpi4py.MPI.ERR_TRUNCATE reference/mpi4py.MPI.ERR_IN_STATUS reference/mpi4py.MPI.ERR_PENDING reference/mpi4py.MPI.ERR_TOPOLOGY reference/mpi4py.MPI.ERR_DIMS reference/mpi4py.MPI.ERR_ARG reference/mpi4py.MPI.ERR_OTHER reference/mpi4py.MPI.ERR_UNKNOWN reference/mpi4py.MPI.ERR_INTERN reference/mpi4py.MPI.ERR_KEYVAL reference/mpi4py.MPI.ERR_NO_MEM reference/mpi4py.MPI.ERR_INFO_KEY reference/mpi4py.MPI.ERR_INFO_VALUE reference/mpi4py.MPI.ERR_INFO_NOKEY 
reference/mpi4py.MPI.ERR_SPAWN reference/mpi4py.MPI.ERR_PORT reference/mpi4py.MPI.ERR_SERVICE reference/mpi4py.MPI.ERR_NAME reference/mpi4py.MPI.ERR_PROC_ABORTED reference/mpi4py.MPI.ERR_BASE reference/mpi4py.MPI.ERR_SIZE reference/mpi4py.MPI.ERR_DISP reference/mpi4py.MPI.ERR_ASSERT reference/mpi4py.MPI.ERR_LOCKTYPE reference/mpi4py.MPI.ERR_RMA_CONFLICT reference/mpi4py.MPI.ERR_RMA_SYNC reference/mpi4py.MPI.ERR_RMA_RANGE reference/mpi4py.MPI.ERR_RMA_ATTACH reference/mpi4py.MPI.ERR_RMA_SHARED reference/mpi4py.MPI.ERR_RMA_FLAVOR reference/mpi4py.MPI.ERR_BAD_FILE reference/mpi4py.MPI.ERR_NO_SUCH_FILE reference/mpi4py.MPI.ERR_FILE_EXISTS reference/mpi4py.MPI.ERR_FILE_IN_USE reference/mpi4py.MPI.ERR_AMODE reference/mpi4py.MPI.ERR_ACCESS reference/mpi4py.MPI.ERR_READ_ONLY reference/mpi4py.MPI.ERR_NO_SPACE reference/mpi4py.MPI.ERR_QUOTA reference/mpi4py.MPI.ERR_NOT_SAME reference/mpi4py.MPI.ERR_IO reference/mpi4py.MPI.ERR_UNSUPPORTED_OPERATION reference/mpi4py.MPI.ERR_UNSUPPORTED_DATAREP reference/mpi4py.MPI.ERR_CONVERSION reference/mpi4py.MPI.ERR_DUP_DATAREP reference/mpi4py.MPI.ERR_VALUE_TOO_LARGE reference/mpi4py.MPI.ERR_REVOKED reference/mpi4py.MPI.ERR_PROC_FAILED reference/mpi4py.MPI.ERR_PROC_FAILED_PENDING reference/mpi4py.MPI.ORDER_C reference/mpi4py.MPI.ORDER_FORTRAN reference/mpi4py.MPI.ORDER_F reference/mpi4py.MPI.TYPECLASS_INTEGER reference/mpi4py.MPI.TYPECLASS_REAL reference/mpi4py.MPI.TYPECLASS_COMPLEX reference/mpi4py.MPI.DISTRIBUTE_NONE reference/mpi4py.MPI.DISTRIBUTE_BLOCK reference/mpi4py.MPI.DISTRIBUTE_CYCLIC reference/mpi4py.MPI.DISTRIBUTE_DFLT_DARG reference/mpi4py.MPI.COMBINER_NAMED reference/mpi4py.MPI.COMBINER_DUP reference/mpi4py.MPI.COMBINER_CONTIGUOUS reference/mpi4py.MPI.COMBINER_VECTOR reference/mpi4py.MPI.COMBINER_HVECTOR reference/mpi4py.MPI.COMBINER_INDEXED reference/mpi4py.MPI.COMBINER_HINDEXED reference/mpi4py.MPI.COMBINER_INDEXED_BLOCK reference/mpi4py.MPI.COMBINER_HINDEXED_BLOCK reference/mpi4py.MPI.COMBINER_STRUCT 
reference/mpi4py.MPI.COMBINER_SUBARRAY reference/mpi4py.MPI.COMBINER_DARRAY reference/mpi4py.MPI.COMBINER_RESIZED reference/mpi4py.MPI.COMBINER_VALUE_INDEX reference/mpi4py.MPI.COMBINER_F90_INTEGER reference/mpi4py.MPI.COMBINER_F90_REAL reference/mpi4py.MPI.COMBINER_F90_COMPLEX reference/mpi4py.MPI.F_SOURCE reference/mpi4py.MPI.F_TAG reference/mpi4py.MPI.F_ERROR reference/mpi4py.MPI.F_STATUS_SIZE reference/mpi4py.MPI.IDENT reference/mpi4py.MPI.CONGRUENT reference/mpi4py.MPI.SIMILAR reference/mpi4py.MPI.UNEQUAL reference/mpi4py.MPI.CART reference/mpi4py.MPI.GRAPH reference/mpi4py.MPI.DIST_GRAPH reference/mpi4py.MPI.UNWEIGHTED reference/mpi4py.MPI.WEIGHTS_EMPTY reference/mpi4py.MPI.COMM_TYPE_SHARED reference/mpi4py.MPI.COMM_TYPE_HW_GUIDED reference/mpi4py.MPI.COMM_TYPE_HW_UNGUIDED reference/mpi4py.MPI.COMM_TYPE_RESOURCE_GUIDED reference/mpi4py.MPI.BSEND_OVERHEAD reference/mpi4py.MPI.BUFFER_AUTOMATIC reference/mpi4py.MPI.WIN_FLAVOR_CREATE reference/mpi4py.MPI.WIN_FLAVOR_ALLOCATE reference/mpi4py.MPI.WIN_FLAVOR_DYNAMIC reference/mpi4py.MPI.WIN_FLAVOR_SHARED reference/mpi4py.MPI.WIN_SEPARATE reference/mpi4py.MPI.WIN_UNIFIED reference/mpi4py.MPI.MODE_NOCHECK reference/mpi4py.MPI.MODE_NOSTORE reference/mpi4py.MPI.MODE_NOPUT reference/mpi4py.MPI.MODE_NOPRECEDE reference/mpi4py.MPI.MODE_NOSUCCEED reference/mpi4py.MPI.LOCK_EXCLUSIVE reference/mpi4py.MPI.LOCK_SHARED reference/mpi4py.MPI.MODE_RDONLY reference/mpi4py.MPI.MODE_WRONLY reference/mpi4py.MPI.MODE_RDWR reference/mpi4py.MPI.MODE_CREATE reference/mpi4py.MPI.MODE_EXCL reference/mpi4py.MPI.MODE_DELETE_ON_CLOSE reference/mpi4py.MPI.MODE_UNIQUE_OPEN reference/mpi4py.MPI.MODE_SEQUENTIAL reference/mpi4py.MPI.MODE_APPEND reference/mpi4py.MPI.SEEK_SET reference/mpi4py.MPI.SEEK_CUR reference/mpi4py.MPI.SEEK_END reference/mpi4py.MPI.DISPLACEMENT_CURRENT reference/mpi4py.MPI.DISP_CUR reference/mpi4py.MPI.THREAD_SINGLE reference/mpi4py.MPI.THREAD_FUNNELED reference/mpi4py.MPI.THREAD_SERIALIZED reference/mpi4py.MPI.THREAD_MULTIPLE 
reference/mpi4py.MPI.VERSION reference/mpi4py.MPI.SUBVERSION reference/mpi4py.MPI.MAX_PROCESSOR_NAME reference/mpi4py.MPI.MAX_ERROR_STRING reference/mpi4py.MPI.MAX_PORT_NAME reference/mpi4py.MPI.MAX_INFO_KEY reference/mpi4py.MPI.MAX_INFO_VAL reference/mpi4py.MPI.MAX_OBJECT_NAME reference/mpi4py.MPI.MAX_DATAREP_STRING reference/mpi4py.MPI.MAX_LIBRARY_VERSION_STRING reference/mpi4py.MPI.MAX_PSET_NAME_LEN reference/mpi4py.MPI.MAX_STRINGTAG_LEN reference/mpi4py.MPI.DATATYPE_NULL reference/mpi4py.MPI.PACKED reference/mpi4py.MPI.BYTE reference/mpi4py.MPI.AINT reference/mpi4py.MPI.OFFSET reference/mpi4py.MPI.COUNT reference/mpi4py.MPI.CHAR reference/mpi4py.MPI.WCHAR reference/mpi4py.MPI.SIGNED_CHAR reference/mpi4py.MPI.SHORT reference/mpi4py.MPI.INT reference/mpi4py.MPI.LONG reference/mpi4py.MPI.LONG_LONG reference/mpi4py.MPI.UNSIGNED_CHAR reference/mpi4py.MPI.UNSIGNED_SHORT reference/mpi4py.MPI.UNSIGNED reference/mpi4py.MPI.UNSIGNED_LONG reference/mpi4py.MPI.UNSIGNED_LONG_LONG reference/mpi4py.MPI.FLOAT reference/mpi4py.MPI.DOUBLE reference/mpi4py.MPI.LONG_DOUBLE reference/mpi4py.MPI.C_BOOL reference/mpi4py.MPI.INT8_T reference/mpi4py.MPI.INT16_T reference/mpi4py.MPI.INT32_T reference/mpi4py.MPI.INT64_T reference/mpi4py.MPI.UINT8_T reference/mpi4py.MPI.UINT16_T reference/mpi4py.MPI.UINT32_T reference/mpi4py.MPI.UINT64_T reference/mpi4py.MPI.C_COMPLEX reference/mpi4py.MPI.C_FLOAT_COMPLEX reference/mpi4py.MPI.C_DOUBLE_COMPLEX reference/mpi4py.MPI.C_LONG_DOUBLE_COMPLEX reference/mpi4py.MPI.CXX_BOOL reference/mpi4py.MPI.CXX_FLOAT_COMPLEX reference/mpi4py.MPI.CXX_DOUBLE_COMPLEX reference/mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX reference/mpi4py.MPI.SHORT_INT reference/mpi4py.MPI.INT_INT reference/mpi4py.MPI.TWOINT reference/mpi4py.MPI.LONG_INT reference/mpi4py.MPI.FLOAT_INT reference/mpi4py.MPI.DOUBLE_INT reference/mpi4py.MPI.LONG_DOUBLE_INT reference/mpi4py.MPI.CHARACTER reference/mpi4py.MPI.LOGICAL reference/mpi4py.MPI.INTEGER reference/mpi4py.MPI.REAL 
reference/mpi4py.MPI.DOUBLE_PRECISION reference/mpi4py.MPI.COMPLEX reference/mpi4py.MPI.DOUBLE_COMPLEX reference/mpi4py.MPI.LOGICAL1 reference/mpi4py.MPI.LOGICAL2 reference/mpi4py.MPI.LOGICAL4 reference/mpi4py.MPI.LOGICAL8 reference/mpi4py.MPI.INTEGER1 reference/mpi4py.MPI.INTEGER2 reference/mpi4py.MPI.INTEGER4 reference/mpi4py.MPI.INTEGER8 reference/mpi4py.MPI.INTEGER16 reference/mpi4py.MPI.REAL2 reference/mpi4py.MPI.REAL4 reference/mpi4py.MPI.REAL8 reference/mpi4py.MPI.REAL16 reference/mpi4py.MPI.COMPLEX4 reference/mpi4py.MPI.COMPLEX8 reference/mpi4py.MPI.COMPLEX16 reference/mpi4py.MPI.COMPLEX32 reference/mpi4py.MPI.UNSIGNED_INT reference/mpi4py.MPI.SIGNED_SHORT reference/mpi4py.MPI.SIGNED_INT reference/mpi4py.MPI.SIGNED_LONG reference/mpi4py.MPI.SIGNED_LONG_LONG reference/mpi4py.MPI.BOOL reference/mpi4py.MPI.SINT8_T reference/mpi4py.MPI.SINT16_T reference/mpi4py.MPI.SINT32_T reference/mpi4py.MPI.SINT64_T reference/mpi4py.MPI.F_BOOL reference/mpi4py.MPI.F_INT reference/mpi4py.MPI.F_FLOAT reference/mpi4py.MPI.F_DOUBLE reference/mpi4py.MPI.F_COMPLEX reference/mpi4py.MPI.F_FLOAT_COMPLEX reference/mpi4py.MPI.F_DOUBLE_COMPLEX reference/mpi4py.MPI.REQUEST_NULL reference/mpi4py.MPI.MESSAGE_NULL reference/mpi4py.MPI.MESSAGE_NO_PROC reference/mpi4py.MPI.OP_NULL reference/mpi4py.MPI.MAX reference/mpi4py.MPI.MIN reference/mpi4py.MPI.SUM reference/mpi4py.MPI.PROD reference/mpi4py.MPI.LAND reference/mpi4py.MPI.BAND reference/mpi4py.MPI.LOR reference/mpi4py.MPI.BOR reference/mpi4py.MPI.LXOR reference/mpi4py.MPI.BXOR reference/mpi4py.MPI.MAXLOC reference/mpi4py.MPI.MINLOC reference/mpi4py.MPI.REPLACE reference/mpi4py.MPI.NO_OP reference/mpi4py.MPI.GROUP_NULL reference/mpi4py.MPI.GROUP_EMPTY reference/mpi4py.MPI.INFO_NULL reference/mpi4py.MPI.INFO_ENV reference/mpi4py.MPI.ERRHANDLER_NULL reference/mpi4py.MPI.ERRORS_RETURN reference/mpi4py.MPI.ERRORS_ABORT reference/mpi4py.MPI.ERRORS_ARE_FATAL reference/mpi4py.MPI.SESSION_NULL reference/mpi4py.MPI.COMM_NULL 
reference/mpi4py.MPI.COMM_SELF reference/mpi4py.MPI.COMM_WORLD reference/mpi4py.MPI.WIN_NULL reference/mpi4py.MPI.FILE_NULL reference/mpi4py.MPI.pickle citation install develop guidelines license changes }
/build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: dlpack:python-spec
/build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: numba:cuda-array-interface
/build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsDLPack:3: WARNING: undefined label: 'dlpack:python-spec'
/build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsCAI:3: WARNING: undefined label: 'numba:cuda-array-interface'
done
build succeeded, 6 warnings.
The manual pages are in _build/man.
Running Sphinx v7.4.7
loading translations [en]... done
making output directory... done
loading pickled environment... done
[autosummary] generating autosummary for: changes.rst, citation.rst, develop.rst, guidelines.rst, index.rst, install.rst, intro.rst, license.rst, mpi4py.MPI.rst, mpi4py.bench.rst, ..., reference/mpi4py.MPI.WTIME_IS_GLOBAL.rst, reference/mpi4py.MPI.Win.rst, reference/mpi4py.MPI.Wtick.rst, reference/mpi4py.MPI.Wtime.rst, reference/mpi4py.MPI.buffer.rst, reference/mpi4py.MPI.get_vendor.rst, reference/mpi4py.MPI.memory.rst, reference/mpi4py.MPI.pickle.rst, reference/mpi4py.MPI.rst, tutorial.rst
loading intersphinx inventory 'python' from /usr/share/doc/python3/html/objects.inv...
loading intersphinx inventory 'numpy' from /usr/share/doc/python-numpy/html/objects.inv...
loading intersphinx inventory 'dlpack' from https://dmlc.github.io/dlpack/latest/objects.inv...
loading intersphinx inventory 'numba' from https://numba.readthedocs.io/en/stable/objects.inv...
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://numba.readthedocs.io/en/stable/objects.inv' not fetchable due to : HTTPSConnectionPool(host='numba.readthedocs.io', port=443): Max retries exceeded with url: /en/stable/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://dmlc.github.io/dlpack/latest/objects.inv' not fetchable due to : HTTPSConnectionPool(host='dmlc.github.io', port=443): Max retries exceeded with url: /dlpack/latest/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
building [mo]: targets for 0 po files that are out of date
writing output...
building [texinfo]: all documents
updating environment: 0 added, 0 changed, 0 removed
reading sources...
looking for now-outdated files... none found
copying Texinfo support files... Makefile done
processing mpi4py.texi...
index intro overview tutorial mpi4py mpi4py.MPI mpi4py.typing mpi4py.futures mpi4py.util mpi4py.util.dtlib mpi4py.util.pkl5 mpi4py.util.pool mpi4py.util.sync mpi4py.run mpi4py.bench reference reference/mpi4py.MPI reference/mpi4py.MPI.BottomType reference/mpi4py.MPI.BufferAutomaticType reference/mpi4py.MPI.Cartcomm reference/mpi4py.MPI.Comm reference/mpi4py.MPI.Datatype reference/mpi4py.MPI.Distgraphcomm reference/mpi4py.MPI.Errhandler reference/mpi4py.MPI.File reference/mpi4py.MPI.Graphcomm reference/mpi4py.MPI.Grequest reference/mpi4py.MPI.Group reference/mpi4py.MPI.InPlaceType reference/mpi4py.MPI.Info reference/mpi4py.MPI.Intercomm reference/mpi4py.MPI.Intracomm reference/mpi4py.MPI.Message reference/mpi4py.MPI.Op reference/mpi4py.MPI.Pickle reference/mpi4py.MPI.Prequest reference/mpi4py.MPI.Request reference/mpi4py.MPI.Session reference/mpi4py.MPI.Status reference/mpi4py.MPI.Topocomm reference/mpi4py.MPI.Win reference/mpi4py.MPI.buffer reference/mpi4py.MPI.memory reference/mpi4py.MPI.Exception reference/mpi4py.MPI.Add_error_class reference/mpi4py.MPI.Add_error_code reference/mpi4py.MPI.Add_error_string reference/mpi4py.MPI.Aint_add reference/mpi4py.MPI.Aint_diff reference/mpi4py.MPI.Alloc_mem reference/mpi4py.MPI.Attach_buffer reference/mpi4py.MPI.Close_port reference/mpi4py.MPI.Compute_dims reference/mpi4py.MPI.Detach_buffer reference/mpi4py.MPI.Finalize reference/mpi4py.MPI.Flush_buffer reference/mpi4py.MPI.Free_mem reference/mpi4py.MPI.Get_address reference/mpi4py.MPI.Get_error_class reference/mpi4py.MPI.Get_error_string reference/mpi4py.MPI.Get_hw_resource_info reference/mpi4py.MPI.Get_library_version reference/mpi4py.MPI.Get_processor_name reference/mpi4py.MPI.Get_version reference/mpi4py.MPI.Iflush_buffer reference/mpi4py.MPI.Init reference/mpi4py.MPI.Init_thread reference/mpi4py.MPI.Is_finalized reference/mpi4py.MPI.Is_initialized reference/mpi4py.MPI.Is_thread_main reference/mpi4py.MPI.Lookup_name reference/mpi4py.MPI.Open_port 
reference/mpi4py.MPI.Pcontrol reference/mpi4py.MPI.Publish_name reference/mpi4py.MPI.Query_thread reference/mpi4py.MPI.Register_datarep reference/mpi4py.MPI.Remove_error_class reference/mpi4py.MPI.Remove_error_code reference/mpi4py.MPI.Remove_error_string reference/mpi4py.MPI.Unpublish_name reference/mpi4py.MPI.Wtick reference/mpi4py.MPI.Wtime reference/mpi4py.MPI.get_vendor reference/mpi4py.MPI.UNDEFINED reference/mpi4py.MPI.ANY_SOURCE reference/mpi4py.MPI.ANY_TAG reference/mpi4py.MPI.PROC_NULL reference/mpi4py.MPI.ROOT reference/mpi4py.MPI.BOTTOM reference/mpi4py.MPI.IN_PLACE reference/mpi4py.MPI.KEYVAL_INVALID reference/mpi4py.MPI.TAG_UB reference/mpi4py.MPI.IO reference/mpi4py.MPI.WTIME_IS_GLOBAL reference/mpi4py.MPI.UNIVERSE_SIZE reference/mpi4py.MPI.APPNUM reference/mpi4py.MPI.LASTUSEDCODE reference/mpi4py.MPI.WIN_BASE reference/mpi4py.MPI.WIN_SIZE reference/mpi4py.MPI.WIN_DISP_UNIT reference/mpi4py.MPI.WIN_CREATE_FLAVOR reference/mpi4py.MPI.WIN_FLAVOR reference/mpi4py.MPI.WIN_MODEL reference/mpi4py.MPI.SUCCESS reference/mpi4py.MPI.ERR_LASTCODE reference/mpi4py.MPI.ERR_TYPE reference/mpi4py.MPI.ERR_REQUEST reference/mpi4py.MPI.ERR_OP reference/mpi4py.MPI.ERR_GROUP reference/mpi4py.MPI.ERR_INFO reference/mpi4py.MPI.ERR_ERRHANDLER reference/mpi4py.MPI.ERR_SESSION reference/mpi4py.MPI.ERR_COMM reference/mpi4py.MPI.ERR_WIN reference/mpi4py.MPI.ERR_FILE reference/mpi4py.MPI.ERR_BUFFER reference/mpi4py.MPI.ERR_COUNT reference/mpi4py.MPI.ERR_TAG reference/mpi4py.MPI.ERR_RANK reference/mpi4py.MPI.ERR_ROOT reference/mpi4py.MPI.ERR_TRUNCATE reference/mpi4py.MPI.ERR_IN_STATUS reference/mpi4py.MPI.ERR_PENDING reference/mpi4py.MPI.ERR_TOPOLOGY reference/mpi4py.MPI.ERR_DIMS reference/mpi4py.MPI.ERR_ARG reference/mpi4py.MPI.ERR_OTHER reference/mpi4py.MPI.ERR_UNKNOWN reference/mpi4py.MPI.ERR_INTERN reference/mpi4py.MPI.ERR_KEYVAL reference/mpi4py.MPI.ERR_NO_MEM reference/mpi4py.MPI.ERR_INFO_KEY reference/mpi4py.MPI.ERR_INFO_VALUE reference/mpi4py.MPI.ERR_INFO_NOKEY 
reference/mpi4py.MPI.ERR_SPAWN reference/mpi4py.MPI.ERR_PORT reference/mpi4py.MPI.ERR_SERVICE reference/mpi4py.MPI.ERR_NAME reference/mpi4py.MPI.ERR_PROC_ABORTED reference/mpi4py.MPI.ERR_BASE reference/mpi4py.MPI.ERR_SIZE reference/mpi4py.MPI.ERR_DISP reference/mpi4py.MPI.ERR_ASSERT reference/mpi4py.MPI.ERR_LOCKTYPE reference/mpi4py.MPI.ERR_RMA_CONFLICT reference/mpi4py.MPI.ERR_RMA_SYNC reference/mpi4py.MPI.ERR_RMA_RANGE reference/mpi4py.MPI.ERR_RMA_ATTACH reference/mpi4py.MPI.ERR_RMA_SHARED reference/mpi4py.MPI.ERR_RMA_FLAVOR reference/mpi4py.MPI.ERR_BAD_FILE reference/mpi4py.MPI.ERR_NO_SUCH_FILE reference/mpi4py.MPI.ERR_FILE_EXISTS reference/mpi4py.MPI.ERR_FILE_IN_USE reference/mpi4py.MPI.ERR_AMODE reference/mpi4py.MPI.ERR_ACCESS reference/mpi4py.MPI.ERR_READ_ONLY reference/mpi4py.MPI.ERR_NO_SPACE reference/mpi4py.MPI.ERR_QUOTA reference/mpi4py.MPI.ERR_NOT_SAME reference/mpi4py.MPI.ERR_IO reference/mpi4py.MPI.ERR_UNSUPPORTED_OPERATION reference/mpi4py.MPI.ERR_UNSUPPORTED_DATAREP reference/mpi4py.MPI.ERR_CONVERSION reference/mpi4py.MPI.ERR_DUP_DATAREP reference/mpi4py.MPI.ERR_VALUE_TOO_LARGE reference/mpi4py.MPI.ERR_REVOKED reference/mpi4py.MPI.ERR_PROC_FAILED reference/mpi4py.MPI.ERR_PROC_FAILED_PENDING reference/mpi4py.MPI.ORDER_C reference/mpi4py.MPI.ORDER_FORTRAN reference/mpi4py.MPI.ORDER_F reference/mpi4py.MPI.TYPECLASS_INTEGER reference/mpi4py.MPI.TYPECLASS_REAL reference/mpi4py.MPI.TYPECLASS_COMPLEX reference/mpi4py.MPI.DISTRIBUTE_NONE reference/mpi4py.MPI.DISTRIBUTE_BLOCK reference/mpi4py.MPI.DISTRIBUTE_CYCLIC reference/mpi4py.MPI.DISTRIBUTE_DFLT_DARG reference/mpi4py.MPI.COMBINER_NAMED reference/mpi4py.MPI.COMBINER_DUP reference/mpi4py.MPI.COMBINER_CONTIGUOUS reference/mpi4py.MPI.COMBINER_VECTOR reference/mpi4py.MPI.COMBINER_HVECTOR reference/mpi4py.MPI.COMBINER_INDEXED reference/mpi4py.MPI.COMBINER_HINDEXED reference/mpi4py.MPI.COMBINER_INDEXED_BLOCK reference/mpi4py.MPI.COMBINER_HINDEXED_BLOCK reference/mpi4py.MPI.COMBINER_STRUCT 
reference/mpi4py.MPI.COMBINER_SUBARRAY reference/mpi4py.MPI.COMBINER_DARRAY reference/mpi4py.MPI.COMBINER_RESIZED reference/mpi4py.MPI.COMBINER_VALUE_INDEX reference/mpi4py.MPI.COMBINER_F90_INTEGER reference/mpi4py.MPI.COMBINER_F90_REAL reference/mpi4py.MPI.COMBINER_F90_COMPLEX reference/mpi4py.MPI.F_SOURCE reference/mpi4py.MPI.F_TAG reference/mpi4py.MPI.F_ERROR reference/mpi4py.MPI.F_STATUS_SIZE reference/mpi4py.MPI.IDENT reference/mpi4py.MPI.CONGRUENT reference/mpi4py.MPI.SIMILAR reference/mpi4py.MPI.UNEQUAL reference/mpi4py.MPI.CART reference/mpi4py.MPI.GRAPH reference/mpi4py.MPI.DIST_GRAPH reference/mpi4py.MPI.UNWEIGHTED reference/mpi4py.MPI.WEIGHTS_EMPTY reference/mpi4py.MPI.COMM_TYPE_SHARED reference/mpi4py.MPI.COMM_TYPE_HW_GUIDED reference/mpi4py.MPI.COMM_TYPE_HW_UNGUIDED reference/mpi4py.MPI.COMM_TYPE_RESOURCE_GUIDED reference/mpi4py.MPI.BSEND_OVERHEAD reference/mpi4py.MPI.BUFFER_AUTOMATIC reference/mpi4py.MPI.WIN_FLAVOR_CREATE reference/mpi4py.MPI.WIN_FLAVOR_ALLOCATE reference/mpi4py.MPI.WIN_FLAVOR_DYNAMIC reference/mpi4py.MPI.WIN_FLAVOR_SHARED reference/mpi4py.MPI.WIN_SEPARATE reference/mpi4py.MPI.WIN_UNIFIED reference/mpi4py.MPI.MODE_NOCHECK reference/mpi4py.MPI.MODE_NOSTORE reference/mpi4py.MPI.MODE_NOPUT reference/mpi4py.MPI.MODE_NOPRECEDE reference/mpi4py.MPI.MODE_NOSUCCEED reference/mpi4py.MPI.LOCK_EXCLUSIVE reference/mpi4py.MPI.LOCK_SHARED reference/mpi4py.MPI.MODE_RDONLY reference/mpi4py.MPI.MODE_WRONLY reference/mpi4py.MPI.MODE_RDWR reference/mpi4py.MPI.MODE_CREATE reference/mpi4py.MPI.MODE_EXCL reference/mpi4py.MPI.MODE_DELETE_ON_CLOSE reference/mpi4py.MPI.MODE_UNIQUE_OPEN reference/mpi4py.MPI.MODE_SEQUENTIAL reference/mpi4py.MPI.MODE_APPEND reference/mpi4py.MPI.SEEK_SET reference/mpi4py.MPI.SEEK_CUR reference/mpi4py.MPI.SEEK_END reference/mpi4py.MPI.DISPLACEMENT_CURRENT reference/mpi4py.MPI.DISP_CUR reference/mpi4py.MPI.THREAD_SINGLE reference/mpi4py.MPI.THREAD_FUNNELED reference/mpi4py.MPI.THREAD_SERIALIZED reference/mpi4py.MPI.THREAD_MULTIPLE 
reference/mpi4py.MPI.VERSION reference/mpi4py.MPI.SUBVERSION reference/mpi4py.MPI.MAX_PROCESSOR_NAME reference/mpi4py.MPI.MAX_ERROR_STRING reference/mpi4py.MPI.MAX_PORT_NAME reference/mpi4py.MPI.MAX_INFO_KEY reference/mpi4py.MPI.MAX_INFO_VAL reference/mpi4py.MPI.MAX_OBJECT_NAME reference/mpi4py.MPI.MAX_DATAREP_STRING reference/mpi4py.MPI.MAX_LIBRARY_VERSION_STRING reference/mpi4py.MPI.MAX_PSET_NAME_LEN reference/mpi4py.MPI.MAX_STRINGTAG_LEN reference/mpi4py.MPI.DATATYPE_NULL reference/mpi4py.MPI.PACKED reference/mpi4py.MPI.BYTE reference/mpi4py.MPI.AINT reference/mpi4py.MPI.OFFSET reference/mpi4py.MPI.COUNT reference/mpi4py.MPI.CHAR reference/mpi4py.MPI.WCHAR reference/mpi4py.MPI.SIGNED_CHAR reference/mpi4py.MPI.SHORT reference/mpi4py.MPI.INT reference/mpi4py.MPI.LONG reference/mpi4py.MPI.LONG_LONG reference/mpi4py.MPI.UNSIGNED_CHAR reference/mpi4py.MPI.UNSIGNED_SHORT reference/mpi4py.MPI.UNSIGNED reference/mpi4py.MPI.UNSIGNED_LONG reference/mpi4py.MPI.UNSIGNED_LONG_LONG reference/mpi4py.MPI.FLOAT reference/mpi4py.MPI.DOUBLE reference/mpi4py.MPI.LONG_DOUBLE reference/mpi4py.MPI.C_BOOL reference/mpi4py.MPI.INT8_T reference/mpi4py.MPI.INT16_T reference/mpi4py.MPI.INT32_T reference/mpi4py.MPI.INT64_T reference/mpi4py.MPI.UINT8_T reference/mpi4py.MPI.UINT16_T reference/mpi4py.MPI.UINT32_T reference/mpi4py.MPI.UINT64_T reference/mpi4py.MPI.C_COMPLEX reference/mpi4py.MPI.C_FLOAT_COMPLEX reference/mpi4py.MPI.C_DOUBLE_COMPLEX reference/mpi4py.MPI.C_LONG_DOUBLE_COMPLEX reference/mpi4py.MPI.CXX_BOOL reference/mpi4py.MPI.CXX_FLOAT_COMPLEX reference/mpi4py.MPI.CXX_DOUBLE_COMPLEX reference/mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX reference/mpi4py.MPI.SHORT_INT reference/mpi4py.MPI.INT_INT reference/mpi4py.MPI.TWOINT reference/mpi4py.MPI.LONG_INT reference/mpi4py.MPI.FLOAT_INT reference/mpi4py.MPI.DOUBLE_INT reference/mpi4py.MPI.LONG_DOUBLE_INT reference/mpi4py.MPI.CHARACTER reference/mpi4py.MPI.LOGICAL reference/mpi4py.MPI.INTEGER reference/mpi4py.MPI.REAL 
reference/mpi4py.MPI.DOUBLE_PRECISION reference/mpi4py.MPI.COMPLEX reference/mpi4py.MPI.DOUBLE_COMPLEX reference/mpi4py.MPI.LOGICAL1 reference/mpi4py.MPI.LOGICAL2 reference/mpi4py.MPI.LOGICAL4 reference/mpi4py.MPI.LOGICAL8 reference/mpi4py.MPI.INTEGER1 reference/mpi4py.MPI.INTEGER2 reference/mpi4py.MPI.INTEGER4 reference/mpi4py.MPI.INTEGER8 reference/mpi4py.MPI.INTEGER16 reference/mpi4py.MPI.REAL2 reference/mpi4py.MPI.REAL4 reference/mpi4py.MPI.REAL8 reference/mpi4py.MPI.REAL16 reference/mpi4py.MPI.COMPLEX4 reference/mpi4py.MPI.COMPLEX8 reference/mpi4py.MPI.COMPLEX16 reference/mpi4py.MPI.COMPLEX32 reference/mpi4py.MPI.UNSIGNED_INT reference/mpi4py.MPI.SIGNED_SHORT reference/mpi4py.MPI.SIGNED_INT reference/mpi4py.MPI.SIGNED_LONG reference/mpi4py.MPI.SIGNED_LONG_LONG reference/mpi4py.MPI.BOOL reference/mpi4py.MPI.SINT8_T reference/mpi4py.MPI.SINT16_T reference/mpi4py.MPI.SINT32_T reference/mpi4py.MPI.SINT64_T reference/mpi4py.MPI.F_BOOL reference/mpi4py.MPI.F_INT reference/mpi4py.MPI.F_FLOAT reference/mpi4py.MPI.F_DOUBLE reference/mpi4py.MPI.F_COMPLEX reference/mpi4py.MPI.F_FLOAT_COMPLEX reference/mpi4py.MPI.F_DOUBLE_COMPLEX reference/mpi4py.MPI.REQUEST_NULL reference/mpi4py.MPI.MESSAGE_NULL reference/mpi4py.MPI.MESSAGE_NO_PROC reference/mpi4py.MPI.OP_NULL reference/mpi4py.MPI.MAX reference/mpi4py.MPI.MIN reference/mpi4py.MPI.SUM reference/mpi4py.MPI.PROD reference/mpi4py.MPI.LAND reference/mpi4py.MPI.BAND reference/mpi4py.MPI.LOR reference/mpi4py.MPI.BOR reference/mpi4py.MPI.LXOR reference/mpi4py.MPI.BXOR reference/mpi4py.MPI.MAXLOC reference/mpi4py.MPI.MINLOC reference/mpi4py.MPI.REPLACE reference/mpi4py.MPI.NO_OP reference/mpi4py.MPI.GROUP_NULL reference/mpi4py.MPI.GROUP_EMPTY reference/mpi4py.MPI.INFO_NULL reference/mpi4py.MPI.INFO_ENV reference/mpi4py.MPI.ERRHANDLER_NULL reference/mpi4py.MPI.ERRORS_RETURN reference/mpi4py.MPI.ERRORS_ABORT reference/mpi4py.MPI.ERRORS_ARE_FATAL reference/mpi4py.MPI.SESSION_NULL reference/mpi4py.MPI.COMM_NULL 
reference/mpi4py.MPI.COMM_SELF reference/mpi4py.MPI.COMM_WORLD reference/mpi4py.MPI.WIN_NULL reference/mpi4py.MPI.FILE_NULL reference/mpi4py.MPI.pickle citation install develop guidelines license changes
resolving references...
/build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: dlpack:python-spec
/build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: numba:cuda-array-interface
/build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsDLPack:3: WARNING: undefined label: 'dlpack:python-spec'
/build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsCAI:3: WARNING: undefined label: 'numba:cuda-array-interface'
done
writing... done
build succeeded, 6 warnings.
The Texinfo files are in _build/texinfo.
Run 'make' in that directory to run these through makeinfo (use 'make info' here to do that automatically).
make[3]: Entering directory '/build/reproducible-path/mpi4py-4.0.0/docs/source/_build/texinfo'
makeinfo --no-split -o 'mpi4py.info' 'mpi4py.texi'
mpi4py.texi:29771: warning: @footnote should not appear on @deffn line
mpi4py.texi:29783: warning: @footnote should not appear on @deffn line
mpi4py.texi:29795: warning: @footnote should not appear on @deffn line
mpi4py.texi:29807: warning: @footnote should not appear on @deffn line
mpi4py.texi:29819: warning: @footnote should not appear on @deffn line
mpi4py.texi:29855: warning: @footnote should not appear on @deffn line
mpi4py.texi:29867: warning: @footnote should not appear on @deffn line
mpi4py.texi:29879: warning: @footnote should not appear on @deffn line
mpi4py.texi:29891: warning: @footnote should not appear on @deffn line
mpi4py.texi:29903: warning: @footnote should not appear on @deffn line
mpi4py.texi:29915: warning: @footnote should not appear on @deffn line
mpi4py.texi:29927: warning: @footnote should not appear on @deffn line
mpi4py.texi:29939: warning: @footnote should not appear on @deffn line
mpi4py.texi:29951: warning: @footnote should not appear on @deffn line
mpi4py.texi:29963: warning: @footnote should not appear on @deffn line
mpi4py.texi:29975: warning: @footnote should not appear on @deffn line
mpi4py.texi:29987: warning: @footnote should not appear on @deffn line
mpi4py.texi:29999: warning: @footnote should not appear on @deffn line
mpi4py.texi:30011: warning: @footnote should not appear on @deffn line
mpi4py.texi:30023: warning: @footnote should not appear on @deffn line
mpi4py.texi:30035: warning: @footnote should not appear on @deffn line
mpi4py.texi:30047: warning: @footnote should not appear on @deffn line
mpi4py.texi:30059: warning: @footnote should not appear on @deffn line
mpi4py.texi:30071: warning: @footnote should not appear on @deffn line
mpi4py.texi:30083: warning: @footnote should not appear on @deffn line
mpi4py.texi:30095: warning: @footnote should not appear on @deffn line
mpi4py.texi:30107: warning: @footnote should not appear on @deffn line
mpi4py.texi:30119: warning: @footnote should not appear on @deffn line
mpi4py.texi:30131: warning: @footnote should not appear on @deffn line
mpi4py.texi:30143: warning: @footnote should not appear on @deffn line
mpi4py.texi:30155: warning: @footnote should not appear on @deffn line
mpi4py.texi:30167: warning: @footnote should not appear on @deffn line
mpi4py.texi:30179: warning: @footnote should not appear on @deffn line
mpi4py.texi:30191: warning: @footnote should not appear on @deffn line
mpi4py.texi:30203: warning: @footnote should not appear on @deffn line
mpi4py.texi:30215: warning: @footnote should not appear on @deffn line
mpi4py.texi:30227: warning: @footnote should not appear on @deffn line
mpi4py.texi:30239: warning: @footnote should not appear on @deffn line
mpi4py.texi:30251: warning: @footnote should not appear on @deffn line
mpi4py.texi:30263: warning: @footnote should not appear on @deffn line
mpi4py.texi:30275: warning: @footnote should not appear on @deffn line
mpi4py.texi:30287: warning: @footnote should not appear on @deffn line
mpi4py.texi:30299: warning: @footnote should not appear on @deffn line
mpi4py.texi:30311: warning: @footnote should not appear on @deffn line
mpi4py.texi:30323: warning: @footnote should not appear on @deffn line
mpi4py.texi:30335: warning: @footnote should not appear on @deffn line
mpi4py.texi:30347: warning: @footnote should not appear on @deffn line
mpi4py.texi:30359: warning: @footnote should not appear on @deffn line
mpi4py.texi:30371: warning: @footnote should not appear on @deffn line
mpi4py.texi:30383: warning: @footnote should not appear on @deffn line
mpi4py.texi:30395: warning: @footnote should not appear on @deffn line
mpi4py.texi:30407: warning: @footnote should not appear on @deffn line
mpi4py.texi:30419: warning: @footnote should not appear on @deffn line
mpi4py.texi:30431: warning: @footnote should not appear on @deffn line
mpi4py.texi:30443: warning: @footnote should not appear on @deffn line
mpi4py.texi:30455: warning: @footnote should not appear on @deffn line
mpi4py.texi:30467: warning: @footnote should not appear on @deffn line
mpi4py.texi:30479: warning: @footnote should not appear on @deffn line
mpi4py.texi:30491: warning: @footnote should not appear on @deffn line
mpi4py.texi:30503: warning: @footnote should not appear on @deffn line
mpi4py.texi:30515: warning: @footnote should not appear on @deffn line
mpi4py.texi:30527: warning: @footnote should not appear on @deffn line
mpi4py.texi:30539: warning: @footnote should not appear on @deffn line
mpi4py.texi:30551: warning: @footnote should not appear on @deffn line
mpi4py.texi:30563: warning: @footnote should not appear on @deffn line
mpi4py.texi:30575: warning: @footnote should not appear on @deffn line
mpi4py.texi:30587: warning: @footnote should not appear on @deffn line
mpi4py.texi:30599: warning: @footnote should not appear on @deffn line
mpi4py.texi:30611: warning: @footnote should not appear on @deffn line
mpi4py.texi:30623: warning: @footnote should not appear on @deffn line
mpi4py.texi:30635: warning: @footnote should not appear on @deffn line
mpi4py.texi:30647: warning: @footnote should not appear on @deffn line
mpi4py.texi:30659: warning: @footnote should not appear on @deffn line
mpi4py.texi:30671: warning: @footnote should not appear on @deffn line
mpi4py.texi:30683: warning: @footnote should not appear on @deffn line
mpi4py.texi:30695: warning: @footnote should not appear on @deffn line
mpi4py.texi:30707: warning: @footnote should not appear on @deffn line
mpi4py.texi:30719: warning: @footnote should not appear on @deffn line
mpi4py.texi:30731: warning: @footnote should not appear on @deffn line
mpi4py.texi:30743: warning: @footnote should not appear on @deffn line
mpi4py.texi:30755: warning: @footnote should not appear on @deffn line
mpi4py.texi:30767: warning: @footnote should not appear on @deffn line
mpi4py.texi:30779: warning: @footnote should not appear on @deffn line
mpi4py.texi:30791: warning: @footnote should not appear on @deffn line
mpi4py.texi:30803: warning: @footnote should not appear on @deffn line
mpi4py.texi:30815: warning: @footnote should not appear on @deffn line
mpi4py.texi:30827: warning: @footnote should not appear on @deffn line
mpi4py.texi:30839: warning: @footnote should not appear on @deffn line
mpi4py.texi:30851: warning: @footnote should not appear on @deffn line
mpi4py.texi:30863: warning: @footnote should not appear on @deffn line
mpi4py.texi:30875: warning: @footnote should not appear on @deffn line
mpi4py.texi:30887: warning: @footnote should not appear on @deffn line
mpi4py.texi:30899: warning: @footnote should not appear on @deffn line
mpi4py.texi:30911: warning: @footnote should not appear on @deffn line
mpi4py.texi:30923: warning: @footnote should not appear on @deffn line
mpi4py.texi:30935: warning: @footnote should not appear on @deffn line
mpi4py.texi:30947: warning: @footnote should not appear on @deffn line
mpi4py.texi:30959: warning: @footnote should not appear on @deffn line
mpi4py.texi:30971: warning: @footnote should not appear on @deffn line
mpi4py.texi:30983: warning: @footnote should not appear on @deffn line
mpi4py.texi:30995: warning: @footnote should not appear on @deffn line
mpi4py.texi:31007: warning: @footnote should not appear on @deffn line
mpi4py.texi:31019: warning: @footnote should not appear on @deffn line
mpi4py.texi:31031: warning: @footnote should not appear on @deffn line
mpi4py.texi:31043: warning: @footnote should not appear on @deffn line
mpi4py.texi:31055: warning: @footnote should not appear on @deffn line
mpi4py.texi:31067: warning: @footnote should not appear on @deffn line
mpi4py.texi:31079: warning: @footnote should not appear on @deffn line
mpi4py.texi:31091: warning: @footnote should not appear on @deffn line
mpi4py.texi:31103: warning: @footnote should not appear on @deffn line
mpi4py.texi:31115: warning: @footnote should not appear on @deffn line
mpi4py.texi:31127: warning: @footnote should not appear on @deffn line
mpi4py.texi:31139: warning: @footnote should not appear on @deffn line
mpi4py.texi:31151: warning: @footnote should not appear on @deffn line
mpi4py.texi:31163: warning: @footnote should not appear on @deffn line
mpi4py.texi:31175: warning: @footnote should not appear on @deffn line
mpi4py.texi:31187: warning: @footnote should not appear on @deffn line
mpi4py.texi:31199: warning: @footnote should not appear on @deffn line
mpi4py.texi:31211: warning: @footnote should not appear on @deffn line
mpi4py.texi:31223: warning: @footnote should not appear on @deffn line
mpi4py.texi:31235: warning: @footnote should not appear on @deffn line
mpi4py.texi:31247: warning: @footnote should not appear on @deffn line
mpi4py.texi:31259: warning: @footnote should not appear on @deffn line
mpi4py.texi:31271: warning: @footnote should not appear on @deffn line
mpi4py.texi:31283: warning: @footnote should not appear on @deffn line
mpi4py.texi:31295: warning: @footnote should not appear on @deffn line
mpi4py.texi:31307: warning: @footnote should not appear on @deffn line
mpi4py.texi:31319: warning: @footnote should not appear on @deffn line
mpi4py.texi:31331: warning: @footnote should not appear on @deffn line
mpi4py.texi:31355: warning: @footnote should not appear on @deffn line
mpi4py.texi:31367: warning: @footnote should not appear on @deffn line
mpi4py.texi:31379: warning: @footnote should not appear on @deffn line
mpi4py.texi:31391: warning: @footnote should not appear on @deffn line
mpi4py.texi:31403: warning: @footnote should not appear on @deffn line
mpi4py.texi:31415: warning: @footnote should not appear on @deffn line
mpi4py.texi:31427: warning: @footnote should not appear on @deffn line
mpi4py.texi:31439: warning: @footnote should not appear on @deffn line
mpi4py.texi:31451: warning: @footnote should not appear on @deffn line
mpi4py.texi:31463: warning: @footnote should not appear on @deffn line
mpi4py.texi:31475: warning: @footnote should not appear on @deffn line
mpi4py.texi:31487: warning: @footnote should not appear on @deffn line
mpi4py.texi:31499: warning: @footnote should not appear on @deffn line
mpi4py.texi:31511: warning: @footnote should not appear on @deffn line
mpi4py.texi:31523: warning: @footnote should not appear on @deffn line
mpi4py.texi:31535: warning: @footnote should not appear on @deffn line
mpi4py.texi:31547: warning: @footnote should not appear on @deffn line
mpi4py.texi:31559: warning: @footnote should not appear on @deffn line
mpi4py.texi:31571: warning: @footnote should not appear on @deffn line
mpi4py.texi:31583: warning: @footnote should not appear on @deffn line
mpi4py.texi:31595: warning: @footnote should not appear on @deffn line
mpi4py.texi:31607: warning: @footnote should not appear on @deffn line
mpi4py.texi:31619: warning: @footnote should not appear on @deffn line
mpi4py.texi:31631: warning: @footnote should not appear on @deffn line
mpi4py.texi:31643: warning: @footnote should not appear on @deffn line
mpi4py.texi:31655: warning: @footnote should not appear on @deffn line
mpi4py.texi:31667: warning: @footnote should not appear on @deffn line
mpi4py.texi:31679: warning: @footnote should not appear on @deffn line
mpi4py.texi:31691: warning: @footnote should not appear on @deffn line
mpi4py.texi:31703: warning: @footnote should not appear on @deffn line
mpi4py.texi:31715: warning: @footnote should not appear on @deffn line
mpi4py.texi:31727: warning: @footnote should not appear on @deffn line
mpi4py.texi:31739: warning: @footnote should not appear on @deffn line
mpi4py.texi:31751: warning: @footnote should not appear on @deffn line
mpi4py.texi:31763: warning: @footnote should not appear on @deffn line
mpi4py.texi:31775: warning: @footnote should not appear on @deffn line
mpi4py.texi:31787: warning: @footnote should not appear on @deffn line
mpi4py.texi:31799: warning: @footnote should not appear on @deffn line
mpi4py.texi:31811: warning: @footnote should not appear on @deffn line
mpi4py.texi:31823: warning: @footnote should not appear on @deffn line
mpi4py.texi:31835: warning: @footnote should not appear on @deffn line
mpi4py.texi:31847: warning: @footnote should not appear on @deffn line
mpi4py.texi:31859: warning: @footnote should not appear on @deffn line
make[3]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0/docs/source/_build/texinfo'
Running Sphinx v7.4.7
loading translations [en]... done
making output directory... done
loading pickled environment...
done
[autosummary] generating autosummary for: changes.rst, citation.rst, develop.rst, guidelines.rst, index.rst, install.rst, intro.rst, license.rst, mpi4py.MPI.rst, mpi4py.bench.rst, ..., reference/mpi4py.MPI.WTIME_IS_GLOBAL.rst, reference/mpi4py.MPI.Win.rst, reference/mpi4py.MPI.Wtick.rst, reference/mpi4py.MPI.Wtime.rst, reference/mpi4py.MPI.buffer.rst, reference/mpi4py.MPI.get_vendor.rst, reference/mpi4py.MPI.memory.rst, reference/mpi4py.MPI.pickle.rst, reference/mpi4py.MPI.rst, tutorial.rst
loading intersphinx inventory 'python' from /usr/share/doc/python3/html/objects.inv...
loading intersphinx inventory 'numpy' from /usr/share/doc/python-numpy/html/objects.inv...
loading intersphinx inventory 'dlpack' from https://dmlc.github.io/dlpack/latest/objects.inv...
loading intersphinx inventory 'numba' from https://numba.readthedocs.io/en/stable/objects.inv...
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://dmlc.github.io/dlpack/latest/objects.inv' not fetchable due to : HTTPSConnectionPool(host='dmlc.github.io', port=443): Max retries exceeded with url: /dlpack/latest/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://numba.readthedocs.io/en/stable/objects.inv' not fetchable due to : HTTPSConnectionPool(host='numba.readthedocs.io', port=443): Max retries exceeded with url: /en/stable/objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')))
building [mo]: targets for 0 po files that are out of date
writing output...
building [latex]: all documents
updating environment: 0 added, 0 changed, 0 removed
reading sources...
looking for now-outdated files... none found
copying TeX support files...
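[Editor's note] The intersphinx warnings above recur in every Sphinx run of this log because pbuilder disables network access during the build, so the remote dlpack and numba inventories can never be fetched. A minimal sketch of how a conf.py could pin inventories to locally installed copies instead (the local paths follow the python3-doc and python-numpy-doc locations seen in this log; the exact keys and paths are assumptions, not taken from mpi4py's actual conf.py):

```python
# Hypothetical Sphinx conf.py fragment for an offline (network-disabled) build.
# Each value is (target_uri, inventory_location): passing a local file path as
# the second element makes intersphinx read objects.inv from disk instead of
# fetching it over HTTPS.
intersphinx_mapping = {
    "python": ("https://docs.python.org/3",
               "/usr/share/doc/python3/html/objects.inv"),
    "numpy": ("https://numpy.org/doc/stable",
              "/usr/share/doc/python-numpy/html/objects.inv"),
    # Entries with no locally packaged inventory (e.g. 'dlpack', 'numba')
    # would need a vendored objects.inv file, or must be dropped for the
    # offline build, which is what produces the warnings seen here.
}
```

With mappings like these, only the entries left pointing at remote URLs would still trigger the "not fetchable" warnings.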
copying TeX support files... done
processing mpi4py.tex...
index intro overview tutorial mpi4py mpi4py.MPI mpi4py.typing mpi4py.futures mpi4py.util mpi4py.util.dtlib mpi4py.util.pkl5 mpi4py.util.pool mpi4py.util.sync mpi4py.run mpi4py.bench reference reference/mpi4py.MPI reference/mpi4py.MPI.BottomType reference/mpi4py.MPI.BufferAutomaticType reference/mpi4py.MPI.Cartcomm reference/mpi4py.MPI.Comm reference/mpi4py.MPI.Datatype reference/mpi4py.MPI.Distgraphcomm reference/mpi4py.MPI.Errhandler reference/mpi4py.MPI.File reference/mpi4py.MPI.Graphcomm reference/mpi4py.MPI.Grequest reference/mpi4py.MPI.Group reference/mpi4py.MPI.InPlaceType reference/mpi4py.MPI.Info reference/mpi4py.MPI.Intercomm reference/mpi4py.MPI.Intracomm reference/mpi4py.MPI.Message reference/mpi4py.MPI.Op reference/mpi4py.MPI.Pickle reference/mpi4py.MPI.Prequest reference/mpi4py.MPI.Request reference/mpi4py.MPI.Session reference/mpi4py.MPI.Status reference/mpi4py.MPI.Topocomm reference/mpi4py.MPI.Win reference/mpi4py.MPI.buffer reference/mpi4py.MPI.memory reference/mpi4py.MPI.Exception reference/mpi4py.MPI.Add_error_class reference/mpi4py.MPI.Add_error_code reference/mpi4py.MPI.Add_error_string reference/mpi4py.MPI.Aint_add reference/mpi4py.MPI.Aint_diff reference/mpi4py.MPI.Alloc_mem reference/mpi4py.MPI.Attach_buffer reference/mpi4py.MPI.Close_port reference/mpi4py.MPI.Compute_dims reference/mpi4py.MPI.Detach_buffer reference/mpi4py.MPI.Finalize reference/mpi4py.MPI.Flush_buffer reference/mpi4py.MPI.Free_mem reference/mpi4py.MPI.Get_address reference/mpi4py.MPI.Get_error_class reference/mpi4py.MPI.Get_error_string reference/mpi4py.MPI.Get_hw_resource_info reference/mpi4py.MPI.Get_library_version reference/mpi4py.MPI.Get_processor_name reference/mpi4py.MPI.Get_version reference/mpi4py.MPI.Iflush_buffer reference/mpi4py.MPI.Init reference/mpi4py.MPI.Init_thread reference/mpi4py.MPI.Is_finalized reference/mpi4py.MPI.Is_initialized reference/mpi4py.MPI.Is_thread_main
reference/mpi4py.MPI.Lookup_name reference/mpi4py.MPI.Open_port reference/mpi4py.MPI.Pcontrol reference/mpi4py.MPI.Publish_name reference/mpi4py.MPI.Query_thread reference/mpi4py.MPI.Register_datarep reference/mpi4py.MPI.Remove_error_class reference/mpi4py.MPI.Remove_error_code reference/mpi4py.MPI.Remove_error_string reference/mpi4py.MPI.Unpublish_name reference/mpi4py.MPI.Wtick reference/mpi4py.MPI.Wtime reference/mpi4py.MPI.get_vendor reference/mpi4py.MPI.UNDEFINED reference/mpi4py.MPI.ANY_SOURCE reference/mpi4py.MPI.ANY_TAG reference/mpi4py.MPI.PROC_NULL reference/mpi4py.MPI.ROOT reference/mpi4py.MPI.BOTTOM reference/mpi4py.MPI.IN_PLACE reference/mpi4py.MPI.KEYVAL_INVALID reference/mpi4py.MPI.TAG_UB reference/mpi4py.MPI.IO reference/mpi4py.MPI.WTIME_IS_GLOBAL reference/mpi4py.MPI.UNIVERSE_SIZE reference/mpi4py.MPI.APPNUM reference/mpi4py.MPI.LASTUSEDCODE reference/mpi4py.MPI.WIN_BASE reference/mpi4py.MPI.WIN_SIZE reference/mpi4py.MPI.WIN_DISP_UNIT reference/mpi4py.MPI.WIN_CREATE_FLAVOR reference/mpi4py.MPI.WIN_FLAVOR reference/mpi4py.MPI.WIN_MODEL reference/mpi4py.MPI.SUCCESS reference/mpi4py.MPI.ERR_LASTCODE reference/mpi4py.MPI.ERR_TYPE reference/mpi4py.MPI.ERR_REQUEST reference/mpi4py.MPI.ERR_OP reference/mpi4py.MPI.ERR_GROUP reference/mpi4py.MPI.ERR_INFO reference/mpi4py.MPI.ERR_ERRHANDLER reference/mpi4py.MPI.ERR_SESSION reference/mpi4py.MPI.ERR_COMM reference/mpi4py.MPI.ERR_WIN reference/mpi4py.MPI.ERR_FILE reference/mpi4py.MPI.ERR_BUFFER reference/mpi4py.MPI.ERR_COUNT reference/mpi4py.MPI.ERR_TAG reference/mpi4py.MPI.ERR_RANK reference/mpi4py.MPI.ERR_ROOT reference/mpi4py.MPI.ERR_TRUNCATE reference/mpi4py.MPI.ERR_IN_STATUS reference/mpi4py.MPI.ERR_PENDING reference/mpi4py.MPI.ERR_TOPOLOGY reference/mpi4py.MPI.ERR_DIMS reference/mpi4py.MPI.ERR_ARG reference/mpi4py.MPI.ERR_OTHER reference/mpi4py.MPI.ERR_UNKNOWN reference/mpi4py.MPI.ERR_INTERN reference/mpi4py.MPI.ERR_KEYVAL reference/mpi4py.MPI.ERR_NO_MEM reference/mpi4py.MPI.ERR_INFO_KEY 
reference/mpi4py.MPI.ERR_INFO_VALUE reference/mpi4py.MPI.ERR_INFO_NOKEY reference/mpi4py.MPI.ERR_SPAWN reference/mpi4py.MPI.ERR_PORT reference/mpi4py.MPI.ERR_SERVICE reference/mpi4py.MPI.ERR_NAME reference/mpi4py.MPI.ERR_PROC_ABORTED reference/mpi4py.MPI.ERR_BASE reference/mpi4py.MPI.ERR_SIZE reference/mpi4py.MPI.ERR_DISP reference/mpi4py.MPI.ERR_ASSERT reference/mpi4py.MPI.ERR_LOCKTYPE reference/mpi4py.MPI.ERR_RMA_CONFLICT reference/mpi4py.MPI.ERR_RMA_SYNC reference/mpi4py.MPI.ERR_RMA_RANGE reference/mpi4py.MPI.ERR_RMA_ATTACH reference/mpi4py.MPI.ERR_RMA_SHARED reference/mpi4py.MPI.ERR_RMA_FLAVOR reference/mpi4py.MPI.ERR_BAD_FILE reference/mpi4py.MPI.ERR_NO_SUCH_FILE reference/mpi4py.MPI.ERR_FILE_EXISTS reference/mpi4py.MPI.ERR_FILE_IN_USE reference/mpi4py.MPI.ERR_AMODE reference/mpi4py.MPI.ERR_ACCESS reference/mpi4py.MPI.ERR_READ_ONLY reference/mpi4py.MPI.ERR_NO_SPACE reference/mpi4py.MPI.ERR_QUOTA reference/mpi4py.MPI.ERR_NOT_SAME reference/mpi4py.MPI.ERR_IO reference/mpi4py.MPI.ERR_UNSUPPORTED_OPERATION reference/mpi4py.MPI.ERR_UNSUPPORTED_DATAREP reference/mpi4py.MPI.ERR_CONVERSION reference/mpi4py.MPI.ERR_DUP_DATAREP reference/mpi4py.MPI.ERR_VALUE_TOO_LARGE reference/mpi4py.MPI.ERR_REVOKED reference/mpi4py.MPI.ERR_PROC_FAILED reference/mpi4py.MPI.ERR_PROC_FAILED_PENDING reference/mpi4py.MPI.ORDER_C reference/mpi4py.MPI.ORDER_FORTRAN reference/mpi4py.MPI.ORDER_F reference/mpi4py.MPI.TYPECLASS_INTEGER reference/mpi4py.MPI.TYPECLASS_REAL reference/mpi4py.MPI.TYPECLASS_COMPLEX reference/mpi4py.MPI.DISTRIBUTE_NONE reference/mpi4py.MPI.DISTRIBUTE_BLOCK reference/mpi4py.MPI.DISTRIBUTE_CYCLIC reference/mpi4py.MPI.DISTRIBUTE_DFLT_DARG reference/mpi4py.MPI.COMBINER_NAMED reference/mpi4py.MPI.COMBINER_DUP reference/mpi4py.MPI.COMBINER_CONTIGUOUS reference/mpi4py.MPI.COMBINER_VECTOR reference/mpi4py.MPI.COMBINER_HVECTOR reference/mpi4py.MPI.COMBINER_INDEXED reference/mpi4py.MPI.COMBINER_HINDEXED reference/mpi4py.MPI.COMBINER_INDEXED_BLOCK 
reference/mpi4py.MPI.COMBINER_HINDEXED_BLOCK reference/mpi4py.MPI.COMBINER_STRUCT reference/mpi4py.MPI.COMBINER_SUBARRAY reference/mpi4py.MPI.COMBINER_DARRAY reference/mpi4py.MPI.COMBINER_RESIZED reference/mpi4py.MPI.COMBINER_VALUE_INDEX reference/mpi4py.MPI.COMBINER_F90_INTEGER reference/mpi4py.MPI.COMBINER_F90_REAL reference/mpi4py.MPI.COMBINER_F90_COMPLEX reference/mpi4py.MPI.F_SOURCE reference/mpi4py.MPI.F_TAG reference/mpi4py.MPI.F_ERROR reference/mpi4py.MPI.F_STATUS_SIZE reference/mpi4py.MPI.IDENT reference/mpi4py.MPI.CONGRUENT reference/mpi4py.MPI.SIMILAR reference/mpi4py.MPI.UNEQUAL reference/mpi4py.MPI.CART reference/mpi4py.MPI.GRAPH reference/mpi4py.MPI.DIST_GRAPH reference/mpi4py.MPI.UNWEIGHTED reference/mpi4py.MPI.WEIGHTS_EMPTY reference/mpi4py.MPI.COMM_TYPE_SHARED reference/mpi4py.MPI.COMM_TYPE_HW_GUIDED reference/mpi4py.MPI.COMM_TYPE_HW_UNGUIDED reference/mpi4py.MPI.COMM_TYPE_RESOURCE_GUIDED reference/mpi4py.MPI.BSEND_OVERHEAD reference/mpi4py.MPI.BUFFER_AUTOMATIC reference/mpi4py.MPI.WIN_FLAVOR_CREATE reference/mpi4py.MPI.WIN_FLAVOR_ALLOCATE reference/mpi4py.MPI.WIN_FLAVOR_DYNAMIC reference/mpi4py.MPI.WIN_FLAVOR_SHARED reference/mpi4py.MPI.WIN_SEPARATE reference/mpi4py.MPI.WIN_UNIFIED reference/mpi4py.MPI.MODE_NOCHECK reference/mpi4py.MPI.MODE_NOSTORE reference/mpi4py.MPI.MODE_NOPUT reference/mpi4py.MPI.MODE_NOPRECEDE reference/mpi4py.MPI.MODE_NOSUCCEED reference/mpi4py.MPI.LOCK_EXCLUSIVE reference/mpi4py.MPI.LOCK_SHARED reference/mpi4py.MPI.MODE_RDONLY reference/mpi4py.MPI.MODE_WRONLY reference/mpi4py.MPI.MODE_RDWR reference/mpi4py.MPI.MODE_CREATE reference/mpi4py.MPI.MODE_EXCL reference/mpi4py.MPI.MODE_DELETE_ON_CLOSE reference/mpi4py.MPI.MODE_UNIQUE_OPEN reference/mpi4py.MPI.MODE_SEQUENTIAL reference/mpi4py.MPI.MODE_APPEND reference/mpi4py.MPI.SEEK_SET reference/mpi4py.MPI.SEEK_CUR reference/mpi4py.MPI.SEEK_END reference/mpi4py.MPI.DISPLACEMENT_CURRENT reference/mpi4py.MPI.DISP_CUR reference/mpi4py.MPI.THREAD_SINGLE 
reference/mpi4py.MPI.THREAD_FUNNELED reference/mpi4py.MPI.THREAD_SERIALIZED reference/mpi4py.MPI.THREAD_MULTIPLE reference/mpi4py.MPI.VERSION reference/mpi4py.MPI.SUBVERSION reference/mpi4py.MPI.MAX_PROCESSOR_NAME reference/mpi4py.MPI.MAX_ERROR_STRING reference/mpi4py.MPI.MAX_PORT_NAME reference/mpi4py.MPI.MAX_INFO_KEY reference/mpi4py.MPI.MAX_INFO_VAL reference/mpi4py.MPI.MAX_OBJECT_NAME reference/mpi4py.MPI.MAX_DATAREP_STRING reference/mpi4py.MPI.MAX_LIBRARY_VERSION_STRING reference/mpi4py.MPI.MAX_PSET_NAME_LEN reference/mpi4py.MPI.MAX_STRINGTAG_LEN reference/mpi4py.MPI.DATATYPE_NULL reference/mpi4py.MPI.PACKED reference/mpi4py.MPI.BYTE reference/mpi4py.MPI.AINT reference/mpi4py.MPI.OFFSET reference/mpi4py.MPI.COUNT reference/mpi4py.MPI.CHAR reference/mpi4py.MPI.WCHAR reference/mpi4py.MPI.SIGNED_CHAR reference/mpi4py.MPI.SHORT reference/mpi4py.MPI.INT reference/mpi4py.MPI.LONG reference/mpi4py.MPI.LONG_LONG reference/mpi4py.MPI.UNSIGNED_CHAR reference/mpi4py.MPI.UNSIGNED_SHORT reference/mpi4py.MPI.UNSIGNED reference/mpi4py.MPI.UNSIGNED_LONG reference/mpi4py.MPI.UNSIGNED_LONG_LONG reference/mpi4py.MPI.FLOAT reference/mpi4py.MPI.DOUBLE reference/mpi4py.MPI.LONG_DOUBLE reference/mpi4py.MPI.C_BOOL reference/mpi4py.MPI.INT8_T reference/mpi4py.MPI.INT16_T reference/mpi4py.MPI.INT32_T reference/mpi4py.MPI.INT64_T reference/mpi4py.MPI.UINT8_T reference/mpi4py.MPI.UINT16_T reference/mpi4py.MPI.UINT32_T reference/mpi4py.MPI.UINT64_T reference/mpi4py.MPI.C_COMPLEX reference/mpi4py.MPI.C_FLOAT_COMPLEX reference/mpi4py.MPI.C_DOUBLE_COMPLEX reference/mpi4py.MPI.C_LONG_DOUBLE_COMPLEX reference/mpi4py.MPI.CXX_BOOL reference/mpi4py.MPI.CXX_FLOAT_COMPLEX reference/mpi4py.MPI.CXX_DOUBLE_COMPLEX reference/mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX reference/mpi4py.MPI.SHORT_INT reference/mpi4py.MPI.INT_INT reference/mpi4py.MPI.TWOINT reference/mpi4py.MPI.LONG_INT reference/mpi4py.MPI.FLOAT_INT reference/mpi4py.MPI.DOUBLE_INT reference/mpi4py.MPI.LONG_DOUBLE_INT 
reference/mpi4py.MPI.CHARACTER reference/mpi4py.MPI.LOGICAL reference/mpi4py.MPI.INTEGER reference/mpi4py.MPI.REAL reference/mpi4py.MPI.DOUBLE_PRECISION reference/mpi4py.MPI.COMPLEX reference/mpi4py.MPI.DOUBLE_COMPLEX reference/mpi4py.MPI.LOGICAL1 reference/mpi4py.MPI.LOGICAL2 reference/mpi4py.MPI.LOGICAL4 reference/mpi4py.MPI.LOGICAL8 reference/mpi4py.MPI.INTEGER1 reference/mpi4py.MPI.INTEGER2 reference/mpi4py.MPI.INTEGER4 reference/mpi4py.MPI.INTEGER8 reference/mpi4py.MPI.INTEGER16 reference/mpi4py.MPI.REAL2 reference/mpi4py.MPI.REAL4 reference/mpi4py.MPI.REAL8 reference/mpi4py.MPI.REAL16 reference/mpi4py.MPI.COMPLEX4 reference/mpi4py.MPI.COMPLEX8 reference/mpi4py.MPI.COMPLEX16 reference/mpi4py.MPI.COMPLEX32 reference/mpi4py.MPI.UNSIGNED_INT reference/mpi4py.MPI.SIGNED_SHORT reference/mpi4py.MPI.SIGNED_INT reference/mpi4py.MPI.SIGNED_LONG reference/mpi4py.MPI.SIGNED_LONG_LONG reference/mpi4py.MPI.BOOL reference/mpi4py.MPI.SINT8_T reference/mpi4py.MPI.SINT16_T reference/mpi4py.MPI.SINT32_T reference/mpi4py.MPI.SINT64_T reference/mpi4py.MPI.F_BOOL reference/mpi4py.MPI.F_INT reference/mpi4py.MPI.F_FLOAT reference/mpi4py.MPI.F_DOUBLE reference/mpi4py.MPI.F_COMPLEX reference/mpi4py.MPI.F_FLOAT_COMPLEX reference/mpi4py.MPI.F_DOUBLE_COMPLEX reference/mpi4py.MPI.REQUEST_NULL reference/mpi4py.MPI.MESSAGE_NULL reference/mpi4py.MPI.MESSAGE_NO_PROC reference/mpi4py.MPI.OP_NULL reference/mpi4py.MPI.MAX reference/mpi4py.MPI.MIN reference/mpi4py.MPI.SUM reference/mpi4py.MPI.PROD reference/mpi4py.MPI.LAND reference/mpi4py.MPI.BAND reference/mpi4py.MPI.LOR reference/mpi4py.MPI.BOR reference/mpi4py.MPI.LXOR reference/mpi4py.MPI.BXOR reference/mpi4py.MPI.MAXLOC reference/mpi4py.MPI.MINLOC reference/mpi4py.MPI.REPLACE reference/mpi4py.MPI.NO_OP reference/mpi4py.MPI.GROUP_NULL reference/mpi4py.MPI.GROUP_EMPTY reference/mpi4py.MPI.INFO_NULL reference/mpi4py.MPI.INFO_ENV reference/mpi4py.MPI.ERRHANDLER_NULL reference/mpi4py.MPI.ERRORS_RETURN reference/mpi4py.MPI.ERRORS_ABORT 
reference/mpi4py.MPI.ERRORS_ARE_FATAL reference/mpi4py.MPI.SESSION_NULL reference/mpi4py.MPI.COMM_NULL reference/mpi4py.MPI.COMM_SELF reference/mpi4py.MPI.COMM_WORLD reference/mpi4py.MPI.WIN_NULL reference/mpi4py.MPI.FILE_NULL reference/mpi4py.MPI.pickle citation install develop guidelines license changes resolving references... /build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: dlpack:python-spec /build/reproducible-path/mpi4py-4.0.0/docs/source/overview.rst:219: WARNING: 'any' reference target not found: numba:cuda-array-interface /build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsDLPack:3: WARNING: undefined label: 'dlpack:python-spec' /build/reproducible-path/mpi4py-4.0.0/docs/source/docstring of mpi4py.typing.SupportsCAI:3: WARNING: undefined label: 'numba:cuda-array-interface' done writing... done build succeeded, 6 warnings. The LaTeX files are in _build/latex. Run 'make' in that directory to run these through (pdf)latex (use `make latexpdf' here to do that automatically). make[3]: Entering directory '/build/reproducible-path/mpi4py-4.0.0/docs/source/_build/latex' latexmk -pdf -dvi- -ps- 'mpi4py.tex' Rc files read: /etc/LatexMk latexmkrc Latexmk: This is Latexmk, John Collins, 7 Apr. 2024. Version 4.85. No existing .aux file, so I'll make a simple one, and require run of *latex. Latexmk: applying rule 'pdflatex'... Rule 'pdflatex': Reasons for rerun Category 'other': Rerun of 'pdflatex' forced or previously required: Reason or flag: 'Initial setup' ------------ Run number 1 of rule 'pdflatex' ------------ ------------ Running 'pdflatex -recorder "mpi4py.tex"' ------------ This is pdfTeX, Version 3.141592653-2.6-1.40.26 (TeX Live 2025/dev/Debian) (preloaded format=pdflatex) restricted \write18 enabled. 
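The "No existing .aux file" line and "Run number 1 of rule 'pdflatex'" mark the start of latexmk's fixpoint loop: it reruns the compiler until the files LaTeX both reads and writes (.aux, .toc, .idx) stop changing between passes, with a cap so an unstable document cannot loop forever. A minimal sketch of that core idea, where `compile_once` stands in for invoking pdflatex (this is not latexmk's real interface):

```python
# Sketch of latexmk's rerun-until-stable fixpoint, assuming a
# `compile_once` callable that plays the role of one pdflatex pass.
import hashlib
import os

def digest(paths):
    """One combined checksum over the watched generated files."""
    h = hashlib.sha256()
    for p in sorted(paths):
        if os.path.exists(p):
            with open(p, "rb") as f:
                h.update(f.read())
    return h.hexdigest()

def run_until_stable(compile_once, watched, max_runs=5):
    """Rerun the compiler until `watched` files stop changing."""
    runs = 0
    before = digest(watched)
    while runs < max_runs:
        compile_once()
        runs += 1
        after = digest(watched)
        if after == before:  # nothing LaTeX re-reads changed: converged
            break
        before = after
    return runs
```

On run 1 the .aux is created from nothing, so the checksum always changes and at least one rerun follows — which is why the undefined-citation warnings below are expected on the first pass.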
entering extended mode (./mpi4py.tex LaTeX2e <2024-06-01> patch level 2 L3 programming layer <2024-08-16> (./sphinxhowto.cls Document Class: sphinxhowto 2019/12/01 v2.3.0 Document class (Sphinx howto) (/usr/share/texlive/texmf-dist/tex/latex/base/article.cls Document Class: article 2024/02/08 v1.4n Standard LaTeX document class (/usr/share/texlive/texmf-dist/tex/latex/base/size10.clo))) (/usr/share/texlive/texmf-dist/tex/latex/base/inputenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/cmap/cmap.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/fontenc.sty<>) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsmath.sty For additional information on amsmath, use the `?' option. (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amstext.sty (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsgen.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsbsy.sty) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsopn.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amssymb.sty (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amsfonts.sty)) (/usr/share/texlive/texmf-dist/tex/generic/babel/babel.sty (/usr/share/texlive/texmf-dist/tex/generic/babel/txtbabel.def) (/usr/share/texlive/texmf-dist/tex/generic/babel-english/english.ldf)) (/usr/share/texlive/texmf-dist/tex/generic/babel/locale/en/babel-english.tex) (/usr/share/texmf/tex/latex/tex-gyre/tgtermes.sty (/usr/share/texlive/texmf-dist/tex/latex/kvoptions/kvoptions.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/keyval.sty) (/usr/share/texlive/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty) (/usr/share/texlive/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty))) (/usr/share/texmf/tex/latex/tex-gyre/tgheros.sty) (/usr/share/texlive/texmf-dist/tex/latex/fncychap/fncychap.sty) (./sphinx.sty (/usr/share/texlive/texmf-dist/tex/latex/xcolor/xcolor.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/color.cfg) (/usr/share/texlive/texmf-dist/tex/latex/graphics-def/pdftex.def) 
(/usr/share/texlive/texmf-dist/tex/latex/graphics/mathcolor.ltx)) (./sphinxoptionshyperref.sty) (./sphinxoptionsgeometry.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/textcomp.sty) (/usr/share/texlive/texmf-dist/tex/latex/float/float.sty) (/usr/share/texlive/texmf-dist/tex/latex/wrapfig/wrapfig.sty) (/usr/share/texlive/texmf-dist/tex/latex/capt-of/capt-of.sty) (/usr/share/texlive/texmf-dist/tex/latex/tools/multicol.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphicx.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphics.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/trig.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/graphics.cfg))) (./sphinxlatexgraphics.sty) (./sphinxpackageboxes.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.cfg) (/usr/share/texlive/texmf-dist/tex/latex/pict2e/p2e-pdftex.def)) (/usr/share/texlive/texmf-dist/tex/latex/ellipse/ellipse.sty)) (./sphinxlatexadmonitions.sty (/usr/share/texlive/texmf-dist/tex/latex/framed/framed.sty)) (./sphinxlatexliterals.sty (/usr/share/texlive/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/alltt.sty) (/usr/share/texlive/texmf-dist/tex/latex/upquote/upquote.sty) (/usr/share/texlive/texmf-dist/tex/latex/needspace/needspace.sty)) (./sphinxlatexshadowbox.sty) (./sphinxlatexcontainers.sty) (./sphinxhighlight.sty) (./sphinxlatextables.sty (/usr/share/texlive/texmf-dist/tex/latex/tabulary/tabulary.sty (/usr/share/texlive/texmf-dist/tex/latex/tools/array.sty)) (/usr/share/texlive/texmf-dist/tex/latex/tools/longtable.sty) (/usr/share/texlive/texmf-dist/tex/latex/varwidth/varwidth.sty) (/usr/share/texlive/texmf-dist/tex/latex/colortbl/colortbl.sty) (/usr/share/texlive/texmf-dist/tex/latex/booktabs/booktabs.sty)) (./sphinxlatexnumfig.sty) (./sphinxlatexlists.sty) (./sphinxpackagefootnote.sty ) (./sphinxlatexindbibtoc.sty 
(/usr/share/texlive/texmf-dist/tex/latex/base/makeidx.sty)) (./sphinxlatexstylepage.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip-2001-04-09.sty)) (/usr/share/texlive/texmf-dist/tex/latex/fancyhdr/fancyhdr.sty)) (./sphinxlatexstyleheadings.sty (/usr/share/texlive/texmf-dist/tex/latex/titlesec/titlesec.sty)) (./sphinxlatexstyletext.sty) (./sphinxlatexobjects.sty)) (/usr/share/texlive/texmf-dist/tex/latex/geometry/geometry.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/ifvtex.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/iftex.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hyperref.sty (/usr/share/texlive/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty) (/usr/share/texlive/texmf-dist/tex/generic/pdfescape/pdfescape.sty (/usr/share/texlive/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty (/usr/share/texlive/texmf-dist/tex/generic/infwarerr/infwarerr.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hycolor/hycolor.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/nameref.sty (/usr/share/texlive/texmf-dist/tex/latex/refcount/refcount.sty) (/usr/share/texlive/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty)) (/usr/share/texlive/texmf-dist/tex/latex/etoolbox/etoolbox.sty) (/usr/share/texlive/texmf-dist/tex/generic/stringenc/stringenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/pd1enc.def) (/usr/share/texlive/texmf-dist/tex/generic/intcalc/intcalc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/puenc.def) (/usr/share/texlive/texmf-dist/tex/latex/url/url.sty) (/usr/share/texlive/texmf-dist/tex/generic/bitset/bitset.sty (/usr/share/texlive/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty)) (/usr/share/texlive/texmf-dist/tex/latex/base/atbegshi-ltx.sty)) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hpdftex.def (/usr/share/texlive/texmf-dist/tex/latex/base/atveryend-ltx.sty) 
(/usr/share/texlive/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty (/usr/share/texlive/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hypcap/hypcap.sty (/usr/share/texlive/texmf-dist/tex/latex/letltxmacro/letltxmacro.sty)) (./sphinxmessages.sty) Writing index file mpi4py.idx (/usr/share/texmf/tex/latex/tex-gyre/t1qtm.fd) (/usr/share/texlive/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def) LaTeX Warning: Unused global option(s): [a4]. (./mpi4py.aux) (/usr/share/texlive/texmf-dist/tex/context/base/mkii/supp-pdf.mkii [Loading MPS to PDF converter (version 2006.09.02).] ) (/usr/share/texlive/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty (/usr/share/texlive/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg)) *geometry* driver: auto-detecting *geometry* detected driver: pdftex (/usr/share/texmf/tex/latex/tex-gyre/t1qhv.fd)<><><><> (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsa.fd) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsb.fd) No file mpi4py.toc. [1{/var/lib/texmf/fonts/map/pdftex/updmap/pdftex.map}{/usr/share/texmf/fonts/en c/dvips/tex-gyre/q-ec.enc}] LaTeX Warning: Citation `intro:mpi-using' on page 2 undefined on input line 157 . LaTeX Warning: Citation `intro:mpi-ref' on page 2 undefined on input line 157. LaTeX Warning: Citation `intro:mpi-std1' on page 2 undefined on input line 165. LaTeX Warning: Citation `intro:mpi-std2' on page 2 undefined on input line 165. LaTeX Warning: Citation `intro:mpi-mpich' on page 2 undefined on input line 169 . LaTeX Warning: Citation `intro:mpi-openmpi' on page 2 undefined on input line 1 69. LaTeX Warning: Citation `intro:hinsen97' on page 2 undefined on input line 192. LaTeX Warning: Citation `intro:beazley97' on page 2 undefined on input line 193 . 
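The eight "Citation ... undefined" warnings above are normal on run 1: citations resolve through the .aux file, which did not exist yet when this pass started. Rerun-aware drivers decide whether another pass is needed by scanning the log for warnings like these; a simplified sketch of that check (the pattern set is a simplification of what latexmk actually matches):

```python
# Sketch: decide from a LaTeX log whether another compiler pass is
# needed. The patterns are an illustrative subset, not latexmk's
# authoritative list.
import re

RERUN_PATTERNS = [
    r"Citation `[^']+' on page \d+ undefined",
    r"There were undefined references",
    r"Rerun to get cross-references right",
]

def needs_rerun(log_text):
    """True if the log shows unresolved references from this pass."""
    return any(re.search(p, log_text) for p in RERUN_PATTERNS)
```

Applied to the run-1 log above, the `intro:mpi-using` citation warning alone is enough to schedule run 2.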
(/usr/share/texmf/tex/latex/tex-gyre/ts1qtm.fd) [2{/usr/share/texmf/fonts/enc/dvips/tex-gyre/q-ts1.enc}] [3] (/usr/share/texlive/texmf-dist/tex/latex/txfonts/t1txtt.fd) LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 4 undefined on input line 404. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 4 undefined on input line 405. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm' on page 4 undefined on input line 405. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 4 undefined on input line 405. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Is_in ter' on page 4 undefined on input line 406. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Is_in tra' on page 4 undefined on input line 406. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_SELF:mpi4py.MPI.COMM_ SELF' on page 4 undefined on input line 413. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_WORLD:mpi4py.MPI.COMM _WORLD' on page 4 undefined on input line 413. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_s ize' on page 4 undefined on input line 418. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_r ank' on page 4 undefined on input line 419. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_g roup' on page 4 undefined on input line 420. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 4 undefined on input line 421. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 4 undefined on input line 421. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group.Uni on' on page 4 undefined on input line 422. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group.Int ersection' on page 4 undefined on input line 422. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group.Dif ference' on page 4 undefined on input line 422. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Creat e' on page 4 undefined on input line 424. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_group' on page 4 undefined on input line 424. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Clone ' on page 4 undefined on input line 427. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Dup' on page 4 undefined on input line 428. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Split ' on page 4 undefined on input line 428. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_intercomm' on page 4 undefined on input line 429. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Merge' on page 4 undefined on input line 429. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartco mm' on page 4 undefined on input line 432. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graph comm' on page 4 undefined on input line 432. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.D istgraphcomm' on page 4 undefined on input line 432. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 4 undefined on input line 433. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_cart' on page 4 undefined on input line 435. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_graph' on page 4 undefined on input line 436. 
[4] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Send' on page 5 undefined on input line 466. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Recv' on page 5 undefined on input line 466. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Sendr ecv' on page 5 undefined on input line 466. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 5 undefined on input line 468. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm' on page 5 undefined on input line 468. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.send' on page 5 undefined on input line 470. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.recv' on page 5 undefined on input line 470. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.sendr ecv' on page 5 undefined on input line 470. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Isend ' on page 5 undefined on input line 490. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Irecv ' on page 5 undefined on input line 490. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request ' on page 5 undefined on input line 492. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request .Test' on page 5 undefined on input line 493. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request .Wait' on page 5 undefined on input line 494. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request .Cancel' on page 5 undefined on input line 494. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request ' on page 5 undefined on input line 495. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Send_ init' on page 5 undefined on input line 514. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Recv_ init' on page 5 undefined on input line 514. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Preque st' on page 5 undefined on input line 516. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request ' on page 5 undefined on input line 517. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Preque st.Start' on page 5 undefined on input line 518. [5] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Bcast ' on page 6 undefined on input line 566. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatt er' on page 6 undefined on input line 566. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gathe r' on page 6 undefined on input line 566. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allga ther' on page 6 undefined on input line 567. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allto all' on page 6 undefined on input line 567. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.bcast ' on page 6 undefined on input line 569. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.scatt er' on page 6 undefined on input line 569. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.gathe r' on page 6 undefined on input line 569. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.allga ther' on page 6 undefined on input line 569. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.allto all' on page 6 undefined on input line 570. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatt erv' on page 6 undefined on input line 572. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gathe rv' on page 6 undefined on input line 572. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allga therv' on page 6 undefined on input line 572. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allto allv' on page 6 undefined on input line 573. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allto allw' on page 6 undefined on input line 573. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduc e' on page 6 undefined on input line 578. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduc e_scatter' on page 6 undefined on input line 578. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allre duce' on page 6 undefined on input line 578. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Scan' on page 6 undefined on input line 579. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Exscan' on page 6 undefined on input line 579. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.reduc e' on page 6 undefined on input line 580. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.allre duce' on page 6 undefined on input line 580. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.scan' on page 6 undefined on input line 580. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.exscan' on page 6 undefined on input line 581. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SUM:mpi4py.MPI.SUM' on pag e 6 undefined on input line 583. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.PROD:mpi4py.MPI.PROD' on p age 6 undefined on input line 583. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX:mpi4py.MPI.MAX' on pag e 6 undefined on input line 583. LaTeX Warning: Hyper reference `tutorial::doc' on page 6 undefined on input lin e 614. 
[6]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Spawn' on page 7 undefined on input line 650.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 7 undefined on input line 651.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_parent' on page 7 undefined on input line 654.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Open_port:mpi4py.MPI.Open_port' on page 7 undefined on input line 661.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Publish_name:mpi4py.MPI.Publish_name' on page 7 undefined on input line 662.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Accept' on page 7 undefined on input line 663.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Lookup_name:mpi4py.MPI.Lookup_name' on page 7 undefined on input line 664.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Connect' on page 7 undefined on input line 666.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Accept' on page 7 undefined on input line 666.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Connect' on page 7 undefined on input line 667.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 7 undefined on input line 667.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Disconnect' on page 7 undefined on input line 669.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Unpublish_name:mpi4py.MPI.Unpublish_name' on page 7 undefined on input line 671.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Close_port:mpi4py.MPI.Close_port' on page 7 undefined on input line 671.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 7 undefined on input line 697.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Create' on page 7 undefined on input line 698.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Free' on page 7 undefined on input line 700.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Put' on page 7 undefined on input line 704.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get' on page 7 undefined on input line 705.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Accumulate' on page 7 undefined on input line 705.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 7 undefined on input line 705.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Start' on page 7 undefined on input line 714.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Complete' on page 7 undefined on input line 714.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Post' on page 7 undefined on input line 715.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Wait' on page 7 undefined on input line 715.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Fence' on page 7 undefined on input line 717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Lock' on page 7 undefined on input line 718.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Unlock' on page 7 undefined on input line 718.
[7]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 8 undefined on input line 761.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Open' on page 8 undefined on input line 762.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Close' on page 8 undefined on input line 764.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Delete' on page 8 undefined on input line 765.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Set_view' on page 8 undefined on input line 772.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_view' on page 8 undefined on input line 772.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 8 undefined on input line 790.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init_thread:mpi4py.MPI.Init_thread' on page 8 undefined on input line 790.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Finalize:mpi4py.MPI.Finalize' on page 8 undefined on input line 790.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_initialized:mpi4py.MPI.Is_initialized' on page 8 undefined on input line 792.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_finalized:mpi4py.MPI.Is_finalized' on page 8 undefined on input line 792.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 8 undefined on input line 798.
LaTeX Warning: Hyper reference `mpi4py:module-mpi4py' on page 8 undefined on input line 799.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 8 undefined on input line 800.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init_thread:mpi4py.MPI.Init_thread' on page 8 undefined on input line 800.
LaTeX Warning: Hyper reference `mpi4py:module-mpi4py' on page 8 undefined on input line 809.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Finalize:mpi4py.MPI.Finalize' on page 8 undefined on input line 810.
[8]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_version:mpi4py.MPI.Get_version' on page 9 undefined on input line 820.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_processor_name:mpi4py.MPI.Get_processor_name' on page 9 undefined on input line 825.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_attr' on page 9 undefined on input line 831.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_WORLD:mpi4py.MPI.COMM_WORLD' on page 9 undefined on input line 832.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtime:mpi4py.MPI.Wtime' on page 9 undefined on input line 840.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtick:mpi4py.MPI.Wtick' on page 9 undefined on input line 841.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_RETURN:mpi4py.MPI.ERRORS_RETURN' on page 9 undefined on input line 849.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_ARE_FATAL:mpi4py.MPI.ERRORS_ARE_FATAL' on page 9 undefined on input line 849.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Set_errhandler' on page 9 undefined on input line 850.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_errhandler' on page 9 undefined on input line 851.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Create_errhandler' on page 9 undefined on input line 852.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_RETURN:mpi4py.MPI.ERRORS_RETURN' on page 9 undefined on input line 855.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception' on page 9 undefined on input line 857.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_RETURN:mpi4py.MPI.ERRORS_RETURN' on page 9 undefined on input line 863.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_SELF:mpi4py.MPI.COMM_SELF' on page 9 undefined on input line 864.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_WORLD:mpi4py.MPI.COMM_WORLD' on page 9 undefined on input line 864.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 9 undefined on input line 865.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 9 undefined on input line 865.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 9 undefined on input line 865.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_ARE_FATAL:mpi4py.MPI.ERRORS_ARE_FATAL' on page 9 undefined on input line 867.
[9]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.send' on page 10 undefined on input line 917.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.recv' on page 10 undefined on input line 917.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.bcast' on page 10 undefined on input line 917.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.scatter' on page 10 undefined on input line 917.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.gather' on page 10 undefined on input line 918.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.isend' on page 10 undefined on input line 923.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.irecv' on page 10 undefined on input line 923.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 10 undefined on input line 923.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.test' on page 10 undefined on input line 925.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.wait' on page 10 undefined on input line 925.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.recv' on page 10 undefined on input line 928.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.irecv' on page 10 undefined on input line 928.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.recv' on page 10 undefined on input line 932.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.irecv' on page 10 undefined on input line 932.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.scatter' on page 10 undefined on input line 936.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.gather' on page 10 undefined on input line 936.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.allgather' on page 10 undefined on input line 937.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.alltoall' on page 10 undefined on input line 937.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.size' on page 10 undefined on input line 938.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.size' on page 10 undefined on input line 939.
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_PICKLE_PROTOCOL' on page 10 undefined on input line 946.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.PROTOCOL' on page 10 undefined on input line 948.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.pickle:mpi4py.MPI.pickle' on page 10 undefined on input line 949.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 10 undefined on input line 949.
Underfull \hbox (badness 6691) in paragraph at lines 942--951
[]\T1/qtm/m/it/10 MPI for Python \T1/qtm/m/n/10 uses the \T1/qtm/b/n/10 high-est [][]\T1/qtm/m/n/10 pro-to-col ver-sion[][] avail-able in the Python run-time (see the
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Send' on page 10 undefined on input line 959.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Recv' on page 10 undefined on input line 959.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Bcast' on page 10 undefined on input line 959.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatter' on page 10 undefined on input line 959.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gather' on page 10 undefined on input line 960.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatterv' on page 10 undefined on input line 970.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gatherv' on page 10 undefined on input line 970.
[10] (/usr/share/texlive/texmf-dist/tex/latex/txfonts/ts1txtt.fd) [11] [12] [13] [14] [15] [16] [17] [18]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 19 undefined on input line 1563.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.initialize' on page 19 undefined on input line 1577.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.threads' on page 19 undefined on input line 1584.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.thread_level' on page 19 undefined on input line 1591.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.finalize' on page 19 undefined on input line 1598.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.fast_reduce' on page 19 undefined on input line 1605.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.recv_mprobe' on page 19 undefined on input line 1612.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.irecv_bufsz' on page 19 undefined on input line 1619.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.irecv' on page 19 undefined on input line 1622.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.errors' on page 19 undefined on input line 1626.
[19]
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_INITIALIZE' on page 20 undefined on input line 1660.
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_THREADS' on page 20 undefined on input line 1692.
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_THREAD_LEVEL' on page 20 undefined on input line 1728.
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_FINALIZE' on page 20 undefined on input line 1760.
[20]
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_FAST_REDUCE' on page 21 undefined on input line 1792.
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_RECV_MPROBE' on page 21 undefined on input line 1824.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.irecv' on page 21 undefined on input line 1840.
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_IRECV_BUFSZ' on page 21 undefined on input line 1856.
LaTeX Warning: Hyper reference `mpi4py:envvar-MPI4PY_RC_ERRORS' on page 21 undefined on input line 1895.
[21]
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc' on page 22 undefined on input line 1907.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc' on page 22 undefined on input line 1927.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.pickle:mpi4py.MPI.pickle' on page 22 undefined on input line 1927.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 22 undefined on input line 1928.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 22 undefined on input line 1955.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.initialize' on page 22 undefined on input line 1961.
[22{/usr/share/texlive/texmf-dist/fonts/enc/dvips/base/8r.enc}]
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.finalize' on page 23 undefined on input line 2000.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.threads' on page 23 undefined on input line 2035.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.thread_level' on page 23 undefined on input line 2071.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.fast_reduce' on page 23 undefined on input line 2106.
[23]
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.recv_mprobe' on page 24 undefined on input line 2141.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.irecv' on page 24 undefined on input line 2167.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.irecv_bufsz' on page 24 undefined on input line 2173.
LaTeX Warning: Hyper reference `mpi4py:mpi4py.mpi4py.rc.errors' on page 24 undefined on input line 2208.
[24]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.PROTOCOL' on page 25 undefined on input line 2244.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.pickle:mpi4py.MPI.pickle' on page 25 undefined on input line 2245.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 25 undefined on input line 2245.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.THRESHOLD' on page 25 undefined on input line 2281.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.pickle:mpi4py.MPI.pickle' on page 25 undefined on input line 2282.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 25 undefined on input line 2282.
[25]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Grequest:mpi4py.MPI.Grequest' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 26 undefined on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session' on page 26 undefined on input line 2462.
Package tabulary Warning: No suitable columns! on input line 2462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 26 undefined on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 26 undefined on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm' on page 26 undefined on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm' on page 26 undefined on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graphcomm' on page 26 undefined on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.Distgraphcomm' on page 26 undefined on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 26 undefined on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 26 undefined on input line 2531.
Package tabulary Warning: No suitable columns! on input line 2531.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 26 undefined on input line 2551.
Package tabulary Warning: No suitable columns! on input line 2551.
[26]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 27 undefined on input line 2571.
Package tabulary Warning: No suitable columns! on input line 2571.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 27 undefined on input line 2598.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception' on page 27 undefined on input line 2598.
Package tabulary Warning: No suitable columns! on input line 2598.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle' on page 27 undefined on input line 2625.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 27 undefined on input line 2625.
Package tabulary Warning: No suitable columns! on input line 2625.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_version:mpi4py.MPI.Get_version' on page 27 undefined on input line 2655.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_library_version:mpi4py.MPI.Get_library_version' on page 27 undefined on input line 2655.
Package tabulary Warning: No suitable columns! on input line 2655.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init_thread:mpi4py.MPI.Init_thread' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Finalize:mpi4py.MPI.Finalize' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_initialized:mpi4py.MPI.Is_initialized' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_finalized:mpi4py.MPI.Is_finalized' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Finalize:mpi4py.MPI.Finalize' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Query_thread:mpi4py.MPI.Query_thread' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_thread_main:mpi4py.MPI.Is_thread_main' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 27 undefined on input line 2717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init_thread:mpi4py.MPI.Init_thread' on page 27 undefined on input line 2717.
Underfull \hbox (badness 10000) in paragraph at lines 2717--2717
[]|\T1/qtm/m/n/10 In-di-cate whether this thread called \T1/txtt/m/sl/10 Init \T1/qtm/m/n/10 or
Package tabulary Warning: No suitable columns! on input line 2717.
[27]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Alloc_mem:mpi4py.MPI.Alloc_mem' on page 28 undefined on input line 2744.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Free_mem:mpi4py.MPI.Free_mem' on page 28 undefined on input line 2744.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Alloc_mem:mpi4py.MPI.Alloc_mem' on page 28 undefined on input line 2744.
Package tabulary Warning: No suitable columns! on input line 2744.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_address:mpi4py.MPI.Get_address' on page 28 undefined on input line 2778.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Aint_add:mpi4py.MPI.Aint_add' on page 28 undefined on input line 2778.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Aint_diff:mpi4py.MPI.Aint_diff' on page 28 undefined on input line 2778.
Package tabulary Warning: No suitable columns! on input line 2778.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtick:mpi4py.MPI.Wtick' on page 28 undefined on input line 2805.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtime:mpi4py.MPI.Wtime' on page 28 undefined on input line 2805.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtime:mpi4py.MPI.Wtime' on page 28 undefined on input line 2805.
Package tabulary Warning: No suitable columns! on input line 2805.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_error_class:mpi4py.MPI.Get_error_class' on page 28 undefined on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_error_string:mpi4py.MPI.Get_error_string' on page 28 undefined on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Add_error_class:mpi4py.MPI.Add_error_class' on page 28 undefined on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Add_error_code:mpi4py.MPI.Add_error_code' on page 28 undefined on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Add_error_string:mpi4py.MPI.Add_error_string' on page 28 undefined on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Remove_error_class:mpi4py.MPI.Remove_error_class' on page 28 undefined on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Remove_error_code:mpi4py.MPI.Remove_error_code' on page 28 undefined on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Remove_error_string:mpi4py.MPI.Remove_error_string' on page 28 undefined on input line 2874.
Package tabulary Warning: No suitable columns! on input line 2874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Open_port:mpi4py.MPI.Open_port' on page 28 undefined on input line 2922.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Close_port:mpi4py.MPI.Close_port' on page 28 undefined on input line 2922.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Publish_name:mpi4py.MPI.Publish_name' on page 28 undefined on input line 2922.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Unpublish_name:mpi4py.MPI.Unpublish_name' on page 28 undefined on input line 2922.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Lookup_name:mpi4py.MPI.Lookup_name' on page 28 undefined on input line 2922.
Package tabulary Warning: No suitable columns! on input line 2922.
[28]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Attach_buffer:mpi4py.MPI.Attach_buffer' on page 29 undefined on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Detach_buffer:mpi4py.MPI.Detach_buffer' on page 29 undefined on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Flush_buffer:mpi4py.MPI.Flush_buffer' on page 29 undefined on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Iflush_buffer:mpi4py.MPI.Iflush_buffer' on page 29 undefined on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Compute_dims:mpi4py.MPI.Compute_dims' on page 29 undefined on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_processor_name:mpi4py.MPI.Get_processor_name' on page 29 undefined on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Register_datarep:mpi4py.MPI.Register_datarep' on page 29 undefined on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pcontrol:mpi4py.MPI.Pcontrol' on page 29 undefined on input line 2991.
Package tabulary Warning: No suitable columns! on input line 2991.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Register_datarep:mpi4py.MP I.Register_datarep' on page 29 undefined on input line 2991. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pcontrol:mpi4py.MPI.Pcontr ol' on page 29 undefined on input line 2991. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.get_vendor:mpi4py.MPI.get_ vendor' on page 29 undefined on input line 3011. Package tabulary Warning: No suitable columns! on input line 3011. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.get_vendor:mpi4py.MPI.get_ vendor' on page 29 undefined on input line 3011. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNDEFINED:mpi4py.MPI.UNDEF INED' on page 29 undefined on input line 3047. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ANY_SOURCE:mpi4py.MPI.ANY_ SOURCE' on page 29 undefined on input line 3054. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ANY_TAG:mpi4py.MPI.ANY_TAG ' on page 29 undefined on input line 3061. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.PROC_NULL:mpi4py.MPI.PROC_ NULL' on page 29 undefined on input line 3068. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ROOT:mpi4py.MPI.ROOT' on p age 29 undefined on input line 3075. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOTTOM:mpi4py.MPI.BOTTOM' on page 29 undefined on input line 3082. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.Bott omType' on page 29 undefined on input line 3085. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IN_PLACE:mpi4py.MPI.IN_PLA CE' on page 29 undefined on input line 3089. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.InPlaceType:mpi4py.MPI.InP laceType' on page 29 undefined on input line 3092. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BUFFER_AUTOMATIC:mpi4py.MP I.BUFFER_AUTOMATIC' on page 29 undefined on input line 3096. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BufferAutomaticType:mpi4py.MPI.BufferAutomaticType' on page 29 undefined on input line 3099.
Underfull \hbox (badness 10000) in paragraph at lines 3098--3101
[]|\T1/qtm/m/n/10 Con-stant \T1/txtt/m/n/10 BUFFER_AUTOMATIC \T1/qtm/m/n/10 of type
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.KEYVAL_INVALID:mpi4py.MPI.KEYVAL_INVALID' on page 29 undefined on input line 3103.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TAG_UB:mpi4py.MPI.TAG_UB' on page 29 undefined on input line 3110.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IO:mpi4py.MPI.IO' on page 29 undefined on input line 3117.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WTIME_IS_GLOBAL:mpi4py.MPI.WTIME_IS_GLOBAL' on page 29 undefined on input line 3124.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNIVERSE_SIZE:mpi4py.MPI.UNIVERSE_SIZE' on page 29 undefined on input line 3131.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.APPNUM:mpi4py.MPI.APPNUM' on page 29 undefined on input line 3138.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LASTUSEDCODE:mpi4py.MPI.LASTUSEDCODE' on page 29 undefined on input line 3145.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_BASE:mpi4py.MPI.WIN_BASE' on page 29 undefined on input line 3152.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_SIZE:mpi4py.MPI.WIN_SIZE' on page 29 undefined on input line 3159.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_DISP_UNIT:mpi4py.MPI.WIN_DISP_UNIT' on page 29 undefined on input line 3166.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_CREATE_FLAVOR:mpi4py.MPI.WIN_CREATE_FLAVOR' on page 29 undefined on input line 3173.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR:mpi4py.MPI.WIN_FLAVOR' on page 29 undefined on input line 3180.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_MODEL:mpi4py.MPI.WIN_MODEL' on page 29 undefined on input line 3187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SUCCESS:mpi4py.MPI.SUCCESS' on page 29 undefined on input line 3194.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_LASTCODE:mpi4py.MPI.ERR_LASTCODE' on page 29 undefined on input line 3201.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_COMM:mpi4py.MPI.ERR_COMM' on page 29 undefined on input line 3208.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_GROUP:mpi4py.MPI.ERR_GROUP' on page 29 undefined on input line 3215.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TYPE:mpi4py.MPI.ERR_TYPE' on page 29 undefined on input line 3222.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_REQUEST:mpi4py.MPI.ERR_REQUEST' on page 29 undefined on input line 3229.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_OP:mpi4py.MPI.ERR_OP' on page 29 undefined on input line 3236.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ERRHANDLER:mpi4py.MPI.ERR_ERRHANDLER' on page 29 undefined on input line 3243.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_BUFFER:mpi4py.MPI.ERR_BUFFER' on page 29 undefined on input line 3250.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_COUNT:mpi4py.MPI.ERR_COUNT' on page 29 undefined on input line 3257.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TAG:mpi4py.MPI.ERR_TAG' on page 29 undefined on input line 3264.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RANK:mpi4py.MPI.ERR_RANK' on page 29 undefined on input line 3271.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ROOT:mpi4py.MPI.ERR_ROOT' on page 29 undefined on input line 3278.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TRUNCATE:mpi4py.MPI.ERR_TRUNCATE' on page 29 undefined on input line 3285.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_IN_STATUS:mpi4py.MPI.ERR_IN_STATUS' on page 29 undefined on input line 3292.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_PENDING:mpi4py.MPI.ERR_PENDING' on page 29 undefined on input line 3299.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TOPOLOGY:mpi4py.MPI.ERR_TOPOLOGY' on page 29 undefined on input line 3306.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_DIMS:mpi4py.MPI.ERR_DIMS' on page 29 undefined on input line 3313.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ARG:mpi4py.MPI.ERR_ARG' on page 29 undefined on input line 3320.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_OTHER:mpi4py.MPI.ERR_OTHER' on page 29 undefined on input line 3327.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_UNKNOWN:mpi4py.MPI.ERR_UNKNOWN' on page 29 undefined on input line 3334.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INTERN:mpi4py.MPI.ERR_INTERN' on page 29 undefined on input line 3341.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO:mpi4py.MPI.ERR_INFO' on page 29 undefined on input line 3348.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_FILE:mpi4py.MPI.ERR_FILE' on page 29 undefined on input line 3355.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_WIN:mpi4py.MPI.ERR_WIN' on page 29 undefined on input line 3362.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_KEYVAL:mpi4py.MPI.ERR_KEYVAL' on page 29 undefined on input line 3369.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO_KEY:mpi4py.MPI.ERR_INFO_KEY' on page 29 undefined on input line 3376.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO_VALUE:mpi4py.MPI.ERR_INFO_VALUE' on page 29 undefined on input line 3383.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO_NOKEY:mpi4py.MPI.ERR_INFO_NOKEY' on page 29 undefined on input line 3390.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ACCESS:mpi4py.MPI.ERR_ACCESS' on page 29 undefined on input line 3397.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_AMODE:mpi4py.MPI.ERR_AMODE' on page 29 undefined on input line 3404.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_BAD_FILE:mpi4py.MPI.ERR_BAD_FILE' on page 29 undefined on input line 3411.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_FILE_EXISTS:mpi4py.MPI.ERR_FILE_EXISTS' on page 29 undefined on input line 3418.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_FILE_IN_USE:mpi4py.MPI.ERR_FILE_IN_USE' on page 29 undefined on input line 3425.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NO_SPACE:mpi4py.MPI.ERR_NO_SPACE' on page 29 undefined on input line 3432.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NO_SUCH_FILE:mpi4py.MPI.ERR_NO_SUCH_FILE' on page 29 undefined on input line 3439.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_IO:mpi4py.MPI.ERR_IO' on page 29 undefined on input line 3446.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_READ_ONLY:mpi4py.MPI.ERR_READ_ONLY' on page 29 undefined on input line 3453.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_CONVERSION:mpi4py.MPI.ERR_CONVERSION' on page 29 undefined on input line 3460.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_DUP_DATAREP:mpi4py.MPI.ERR_DUP_DATAREP' on page 29 undefined on input line 3467.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_UNSUPPORTED_DATAREP:mpi4py.MPI.ERR_UNSUPPORTED_DATAREP' on page 29 undefined on input line 3474.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_UNSUPPORTED_OPERATION:mpi4py.MPI.ERR_UNSUPPORTED_OPERATION' on page 29 undefined on input line 3481.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NAME:mpi4py.MPI.ERR_NAME' on page 29 undefined on input line 3488.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NO_MEM:mpi4py.MPI.ERR_NO_MEM' on page 29 undefined on input line 3495.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NOT_SAME:mpi4py.MPI.ERR_NOT_SAME' on page 29 undefined on input line 3502.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_PORT:mpi4py.MPI.ERR_PORT' on page 29 undefined on input line 3509.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_QUOTA:mpi4py.MPI.ERR_QUOTA' on page 29 undefined on input line 3516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_SERVICE:mpi4py.MPI.ERR_SERVICE' on page 29 undefined on input line 3523.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_SPAWN:mpi4py.MPI.ERR_SPAWN' on page 29 undefined on input line 3530.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_BASE:mpi4py.MPI.ERR_BASE' on page 29 undefined on input line 3537.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_SIZE:mpi4py.MPI.ERR_SIZE' on page 29 undefined on input line 3544.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_DISP:mpi4py.MPI.ERR_DISP' on page 29 undefined on input line 3551.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ASSERT:mpi4py.MPI.ERR_ASSERT' on page 29 undefined on input line 3558.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_LOCKTYPE:mpi4py.MPI.ERR_LOCKTYPE' on page 29 undefined on input line 3565.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_CONFLICT:mpi4py.MPI.ERR_RMA_CONFLICT' on page 29 undefined on input line 3572.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_SYNC:mpi4py.MPI.ERR_RMA_SYNC' on page 29 undefined on input line 3579.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_RANGE:mpi4py.MPI.ERR_RMA_RANGE' on page 29 undefined on input line 3586.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_ATTACH:mpi4py.MPI.ERR_RMA_ATTACH' on page 29 undefined on input line 3593.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_SHARED:mpi4py.MPI.ERR_RMA_SHARED' on page 29 undefined on input line 3600.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_FLAVOR:mpi4py.MPI.ERR_RMA_FLAVOR' on page 29 undefined on input line 3607.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ORDER_C:mpi4py.MPI.ORDER_C' on page 29 undefined on input line 3614.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ORDER_F:mpi4py.MPI.ORDER_F' on page 29 undefined on input line 3621.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ORDER_FORTRAN:mpi4py.MPI.ORDER_FORTRAN' on page 29 undefined on input line 3628.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TYPECLASS_INTEGER:mpi4py.MPI.TYPECLASS_INTEGER' on page 29 undefined on input line 3635.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TYPECLASS_REAL:mpi4py.MPI.TYPECLASS_REAL' on page 29 undefined on input line 3642.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TYPECLASS_COMPLEX:mpi4py.MPI.TYPECLASS_COMPLEX' on page 29 undefined on input line 3649.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_NONE:mpi4py.MPI.DISTRIBUTE_NONE' on page 29 undefined on input line 3656.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_BLOCK:mpi4py.MPI.DISTRIBUTE_BLOCK' on page 29 undefined on input line 3663.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_CYCLIC:mpi4py.MPI.DISTRIBUTE_CYCLIC' on page 29 undefined on input line 3670.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_DFLT_DARG:mpi4py.MPI.DISTRIBUTE_DFLT_DARG' on page 29 undefined on input line 3677.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_NAMED:mpi4py.MPI.COMBINER_NAMED' on page 29 undefined on input line 3684.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_DUP:mpi4py.MPI.COMBINER_DUP' on page 29 undefined on input line 3691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_CONTIGUOUS:mpi4py.MPI.COMBINER_CONTIGUOUS' on page 29 undefined on input line 3698.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_VECTOR:mpi4py.MPI.COMBINER_VECTOR' on page 29 undefined on input line 3705.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_HVECTOR:mpi4py.MPI.COMBINER_HVECTOR' on page 29 undefined on input line 3712.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_INDEXED:mpi4py.MPI.COMBINER_INDEXED' on page 29 undefined on input line 3719.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_HINDEXED:mpi4py.MPI.COMBINER_HINDEXED' on page 29 undefined on input line 3726.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_INDEXED_BLOCK:mpi4py.MPI.COMBINER_INDEXED_BLOCK' on page 29 undefined on input line 3733.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_HINDEXED_BLOCK:mpi4py.MPI.COMBINER_HINDEXED_BLOCK' on page 29 undefined on input line 3740.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_STRUCT:mpi4py.MPI.COMBINER_STRUCT' on page 29 undefined on input line 3747.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_SUBARRAY:mpi4py.MPI.COMBINER_SUBARRAY' on page 29 undefined on input line 3754.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_DARRAY:mpi4py.MPI.COMBINER_DARRAY' on page 29 undefined on input line 3761.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_RESIZED:mpi4py.MPI.COMBINER_RESIZED' on page 29 undefined on input line 3768.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_VALUE_INDEX:mpi4py.MPI.COMBINER_VALUE_INDEX' on page 29 undefined on input line 3775.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_F90_REAL:mpi4py.MPI.COMBINER_F90_REAL' on page 29 undefined on input line 3782.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_F90_COMPLEX:mpi4py.MPI.COMBINER_F90_COMPLEX' on page 29 undefined on input line 3789.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_F90_INTEGER:mpi4py.MPI.COMBINER_F90_INTEGER' on page 29 undefined on input line 3796.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IDENT:mpi4py.MPI.IDENT' on page 29 undefined on input line 3803.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CONGRUENT:mpi4py.MPI.CONGRUENT' on page 29 undefined on input line 3810.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIMILAR:mpi4py.MPI.SIMILAR' on page 29 undefined on input line 3817.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNEQUAL:mpi4py.MPI.UNEQUAL' on page 29 undefined on input line 3824.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CART:mpi4py.MPI.CART' on page 29 undefined on input line 3831.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.GRAPH:mpi4py.MPI.GRAPH' on page 29 undefined on input line 3838.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DIST_GRAPH:mpi4py.MPI.DIST_GRAPH' on page 29 undefined on input line 3845.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNWEIGHTED:mpi4py.MPI.UNWEIGHTED' on page 29 undefined on input line 3852.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WEIGHTS_EMPTY:mpi4py.MPI.WEIGHTS_EMPTY' on page 29 undefined on input line 3859.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_TYPE_SHARED:mpi4py.MPI.COMM_TYPE_SHARED' on page 29 undefined on input line 3866.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BSEND_OVERHEAD:mpi4py.MPI.BSEND_OVERHEAD' on page 29 undefined on input line 3873.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_CREATE:mpi4py.MPI.WIN_FLAVOR_CREATE' on page 29 undefined on input line 3880.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_ALLOCATE:mpi4py.MPI.WIN_FLAVOR_ALLOCATE' on page 29 undefined on input line 3887.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_DYNAMIC:mpi4py.MPI.WIN_FLAVOR_DYNAMIC' on page 29 undefined on input line 3894.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_SHARED:mpi4py.MPI.WIN_FLAVOR_SHARED' on page 29 undefined on input line 3901.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_SEPARATE:mpi4py.MPI.WIN_SEPARATE' on page 29 undefined on input line 3908.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_UNIFIED:mpi4py.MPI.WIN_UNIFIED' on page 29 undefined on input line 3915.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOCHECK:mpi4py.MPI.MODE_NOCHECK' on page 29 undefined on input line 3922.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOSTORE:mpi4py.MPI.MODE_NOSTORE' on page 29 undefined on input line 3929.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOPUT:mpi4py.MPI.MODE_NOPUT' on page 29 undefined on input line 3936.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOPRECEDE:mpi4py.MPI.MODE_NOPRECEDE' on page 29 undefined on input line 3943.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOSUCCEED:mpi4py.MPI.MODE_NOSUCCEED' on page 29 undefined on input line 3950.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOCK_EXCLUSIVE:mpi4py.MPI.LOCK_EXCLUSIVE' on page 29 undefined on input line 3957.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOCK_SHARED:mpi4py.MPI.LOCK_SHARED' on page 29 undefined on input line 3964.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_RDONLY:mpi4py.MPI.MODE_RDONLY' on page 29 undefined on input line 3971.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_WRONLY:mpi4py.MPI.MODE_WRONLY' on page 29 undefined on input line 3978.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_RDWR:mpi4py.MPI.MODE_RDWR' on page 29 undefined on input line 3985.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_CREATE:mpi4py.MPI.MODE_CREATE' on page 29 undefined on input line 3992.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_EXCL:mpi4py.MPI.MODE_EXCL' on page 29 undefined on input line 3999.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_DELETE_ON_CLOSE:mpi4py.MPI.MODE_DELETE_ON_CLOSE' on page 29 undefined on input line 4006.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_UNIQUE_OPEN:mpi4py.MPI.MODE_UNIQUE_OPEN' on page 29 undefined on input line 4013.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_SEQUENTIAL:mpi4py.MPI.MODE_SEQUENTIAL' on page 29 undefined on input line 4020.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_APPEND:mpi4py.MPI.MODE_APPEND' on page 29 undefined on input line 4027.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SEEK_SET:mpi4py.MPI.SEEK_SET' on page 29 undefined on input line 4034.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SEEK_CUR:mpi4py.MPI.SEEK_CUR' on page 29 undefined on input line 4041.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SEEK_END:mpi4py.MPI.SEEK_END' on page 29 undefined on input line 4048.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISPLACEMENT_CURRENT:mpi4py.MPI.DISPLACEMENT_CURRENT' on page 29 undefined on input line 4055.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISP_CUR:mpi4py.MPI.DISP_CUR' on page 29 undefined on input line 4062.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_SINGLE:mpi4py.MPI.THREAD_SINGLE' on page 29 undefined on input line 4069.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_FUNNELED:mpi4py.MPI.THREAD_FUNNELED' on page 29 undefined on input line 4076.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_SERIALIZED:mpi4py.MPI.THREAD_SERIALIZED' on page 29 undefined on input line 4083.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_MULTIPLE:mpi4py.MPI.THREAD_MULTIPLE' on page 29 undefined on input line 4090.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.VERSION:mpi4py.MPI.VERSION' on page 29 undefined on input line 4097.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SUBVERSION:mpi4py.MPI.SUBVERSION' on page 29 undefined on input line 4104.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_PROCESSOR_NAME:mpi4py.MPI.MAX_PROCESSOR_NAME' on page 29 undefined on input line 4111.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_ERROR_STRING:mpi4py.MPI.MAX_ERROR_STRING' on page 29 undefined on input line 4118.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_PORT_NAME:mpi4py.MPI.MAX_PORT_NAME' on page 29 undefined on input line 4125.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_INFO_KEY:mpi4py.MPI.MAX_INFO_KEY' on page 29 undefined on input line 4132.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_INFO_VAL:mpi4py.MPI.MAX_INFO_VAL' on page 29 undefined on input line 4139.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_OBJECT_NAME:mpi4py.MPI.MAX_OBJECT_NAME' on page 29 undefined on input line 4146.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_DATAREP_STRING:mpi4py.MPI.MAX_DATAREP_STRING' on page 29 undefined on input line 4153.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_LIBRARY_VERSION_STRING:mpi4py.MPI.MAX_LIBRARY_VERSION_STRING' on page 29 undefined on input line 4160.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DATATYPE_NULL:mpi4py.MPI.DATATYPE_NULL' on page 29 undefined on input line 4167.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4170.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.PACKED:mpi4py.MPI.PACKED' on page 29 undefined on input line 4174.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4177.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BYTE:mpi4py.MPI.BYTE' on page 29 undefined on input line 4181.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4184.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.AINT:mpi4py.MPI.AINT' on page 29 undefined on input line 4188.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4191.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.OFFSET:mpi4py.MPI.OFFSET' on page 29 undefined on input line 4195.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4198.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COUNT:mpi4py.MPI.COUNT' on page 29 undefined on input line 4202.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4205.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CHAR:mpi4py.MPI.CHAR' on page 29 undefined on input line 4209.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4212.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WCHAR:mpi4py.MPI.WCHAR' on page 29 undefined on input line 4216.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4219.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_CHAR:mpi4py.MPI.SIGNED_CHAR' on page 29 undefined on input line 4223.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4226.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SHORT:mpi4py.MPI.SHORT' on page 29 undefined on input line 4230.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4233.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT:mpi4py.MPI.INT' on page 29 undefined on input line 4237.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4240.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG:mpi4py.MPI.LONG' on page 29 undefined on input line 4244.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4247.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_LONG:mpi4py.MPI.LONG_LONG' on page 29 undefined on input line 4251.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4254.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_CHAR:mpi4py.MPI.UNSIGNED_CHAR' on page 29 undefined on input line 4258.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4261.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_SHORT:mpi4py.MPI.UNSIGNED_SHORT' on page 29 undefined on input line 4265.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4268.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED:mpi4py.MPI.UNSIGNED' on page 29 undefined on input line 4272.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4275.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_LONG:mpi4py.MPI.UNSIGNED_LONG' on page 29 undefined on input line 4279.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4282.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_LONG_LONG:mpi4py.MPI.UNSIGNED_LONG_LONG' on page 29 undefined on input line 4286.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4289.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.FLOAT:mpi4py.MPI.FLOAT' on page 29 undefined on input line 4293.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4296.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE:mpi4py.MPI.DOUBLE' on page 29 undefined on input line 4300.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4303.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_DOUBLE:mpi4py.MPI.LONG_DOUBLE' on page 29 undefined on input line 4307.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4310.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_BOOL:mpi4py.MPI.C_BOOL' on page 29 undefined on input line 4314.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4317.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT8_T:mpi4py.MPI.INT8_T' on page 29 undefined on input line 4321.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4324.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT16_T:mpi4py.MPI.INT16_T' on page 29 undefined on input line 4328.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4331.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT32_T:mpi4py.MPI.INT32_T' on page 29 undefined on input line 4335.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4338.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT64_T:mpi4py.MPI.INT64_T' on page 29 undefined on input line 4342.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4345.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT8_T:mpi4py.MPI.UINT8_T' on page 29 undefined on input line 4349.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4352.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT16_T:mpi4py.MPI.UINT16_T' on page 29 undefined on input line 4356.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4359.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT32_T:mpi4py.MPI.UINT32_T' on page 29 undefined on input line 4363.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT64_T:mpi4py.MPI.UINT64_T' on page 29 undefined on input line 4370.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4373.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_COMPLEX:mpi4py.MPI.C_COMPLEX' on page 29 undefined on input line 4377.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4380.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_FLOAT_COMPLEX:mpi4py.MPI.C_FLOAT_COMPLEX' on page 29 undefined on input line 4384.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4387.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_DOUBLE_COMPLEX:mpi4py.MPI.C_DOUBLE_COMPLEX' on page 29 undefined on input line 4391.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4394.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_LONG_DOUBLE_COMPLEX:mpi4py.MPI.C_LONG_DOUBLE_COMPLEX' on page 29 undefined on input line 4398.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4401.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_BOOL:mpi4py.MPI.CXX_BOOL' on page 29 undefined on input line 4405.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4408.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_FLOAT_COMPLEX:mpi4py.MPI.CXX_FLOAT_COMPLEX' on page 29 undefined on input line 4412.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4415.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_DOUBLE_COMPLEX:mpi4py.MPI.CXX_DOUBLE_COMPLEX' on page 29 undefined on input line 4419.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4422.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX:mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX' on page 29 undefined on input line 4426.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4429.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SHORT_INT:mpi4py.MPI.SHORT_INT' on page 29 undefined on input line 4433.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4436.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT_INT:mpi4py.MPI.INT_INT' on page 29 undefined on input line 4440.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4443.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TWOINT:mpi4py.MPI.TWOINT' on page 29 undefined on input line 4447.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4450.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_INT:mpi4py.MPI.LONG_INT' on page 29 undefined on input line 4454.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4457.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.FLOAT_INT:mpi4py.MPI.FLOAT_INT' on page 29 undefined on input line 4461.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4464.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE_INT:mpi4py.MPI.DOUBLE_INT' on page 29 undefined on input line 4468.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4471.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_DOUBLE_INT:mpi4py.MPI.LONG_DOUBLE_INT' on page 29 undefined on input line 4475.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4478.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CHARACTER:mpi4py.MPI.CHARACTER' on page 29 undefined on input line 4482.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4485.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL:mpi4py.MPI.LOGICAL' on page 29 undefined on input line 4489.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4492.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER:mpi4py.MPI.INTEGER' on page 29 undefined on input line 4496.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 29 undefined on input line 4499.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL:mpi4py.MPI.REAL' on page 29 undefined on input line 4503.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4506. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE_PRECISION:mpi4py.MP I.DOUBLE_PRECISION' on page 29 undefined on input line 4510. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4513. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX:mpi4py.MPI.COMPLEX ' on page 29 undefined on input line 4517. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4520. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE_COMPLEX:mpi4py.MPI. DOUBLE_COMPLEX' on page 29 undefined on input line 4524. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4527. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL1:mpi4py.MPI.LOGICA L1' on page 29 undefined on input line 4531. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4534. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL2:mpi4py.MPI.LOGICA L2' on page 29 undefined on input line 4538. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4541. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL4:mpi4py.MPI.LOGICA L4' on page 29 undefined on input line 4545. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4548. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL8:mpi4py.MPI.LOGICA L8' on page 29 undefined on input line 4552. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4555. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER1:mpi4py.MPI.INTEGE R1' on page 29 undefined on input line 4559. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4562. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER2:mpi4py.MPI.INTEGE R2' on page 29 undefined on input line 4566. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4569. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER4:mpi4py.MPI.INTEGE R4' on page 29 undefined on input line 4573. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4576. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER8:mpi4py.MPI.INTEGE R8' on page 29 undefined on input line 4580. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4583. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER16:mpi4py.MPI.INTEG ER16' on page 29 undefined on input line 4587. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4590. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL2:mpi4py.MPI.REAL2' on page 29 undefined on input line 4594. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4597. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL4:mpi4py.MPI.REAL4' on page 29 undefined on input line 4601. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4604. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL8:mpi4py.MPI.REAL8' on page 29 undefined on input line 4608. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4611. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL16:mpi4py.MPI.REAL16' on page 29 undefined on input line 4615. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4618. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX4:mpi4py.MPI.COMPLE X4' on page 29 undefined on input line 4622. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4625. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX8:mpi4py.MPI.COMPLE X8' on page 29 undefined on input line 4629. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4632. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX16:mpi4py.MPI.COMPL EX16' on page 29 undefined on input line 4636. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4639. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX32:mpi4py.MPI.COMPL EX32' on page 29 undefined on input line 4643. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4646. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_INT:mpi4py.MPI.UN SIGNED_INT' on page 29 undefined on input line 4650. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4653. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_SHORT:mpi4py.MPI.SI GNED_SHORT' on page 29 undefined on input line 4657. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4660. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_INT:mpi4py.MPI.SIGN ED_INT' on page 29 undefined on input line 4664. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4667. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_LONG:mpi4py.MPI.SIG NED_LONG' on page 29 undefined on input line 4671. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4674. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_LONG_LONG:mpi4py.MP I.SIGNED_LONG_LONG' on page 29 undefined on input line 4678. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4681. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOOL:mpi4py.MPI.BOOL' on p age 29 undefined on input line 4685. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4688. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT8_T:mpi4py.MPI.SINT8_T ' on page 29 undefined on input line 4692. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4695. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT16_T:mpi4py.MPI.SINT16 _T' on page 29 undefined on input line 4699. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4702. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT32_T:mpi4py.MPI.SINT32 _T' on page 29 undefined on input line 4706. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4709. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT64_T:mpi4py.MPI.SINT64 _T' on page 29 undefined on input line 4713. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4716. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_BOOL:mpi4py.MPI.F_BOOL' on page 29 undefined on input line 4720. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4723. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_INT:mpi4py.MPI.F_INT' on page 29 undefined on input line 4727. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4730. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_FLOAT:mpi4py.MPI.F_FLOAT ' on page 29 undefined on input line 4734. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4737. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_DOUBLE:mpi4py.MPI.F_DOUB LE' on page 29 undefined on input line 4741. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4744. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_COMPLEX:mpi4py.MPI.F_COM PLEX' on page 29 undefined on input line 4748. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4751. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_FLOAT_COMPLEX:mpi4py.MPI .F_FLOAT_COMPLEX' on page 29 undefined on input line 4755. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4758. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_DOUBLE_COMPLEX:mpi4py.MP I.F_DOUBLE_COMPLEX' on page 29 undefined on input line 4762. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 29 undefined on input line 4765. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REQUEST_NULL:mpi4py.MPI.RE QUEST_NULL' on page 29 undefined on input line 4769. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request ' on page 29 undefined on input line 4772. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MESSAGE_NULL:mpi4py.MPI.ME SSAGE_NULL' on page 29 undefined on input line 4776. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 29 undefined on input line 4779. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MESSAGE_NO_PROC:mpi4py.MPI .MESSAGE_NO_PROC' on page 29 undefined on input line 4783. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 29 undefined on input line 4786. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.OP_NULL:mpi4py.MPI.OP_NULL ' on page 29 undefined on input line 4790. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4793. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX:mpi4py.MPI.MAX' on pag e 29 undefined on input line 4797. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4800. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MIN:mpi4py.MPI.MIN' on pag e 29 undefined on input line 4804. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4807. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SUM:mpi4py.MPI.SUM' on pag e 29 undefined on input line 4811. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4814. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.PROD:mpi4py.MPI.PROD' on p age 29 undefined on input line 4818. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4821. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LAND:mpi4py.MPI.LAND' on p age 29 undefined on input line 4825. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BAND:mpi4py.MPI.BAND' on p age 29 undefined on input line 4832. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4835. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOR:mpi4py.MPI.LOR' on pag e 29 undefined on input line 4839. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4842. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOR:mpi4py.MPI.BOR' on pag e 29 undefined on input line 4846. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4849. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LXOR:mpi4py.MPI.LXOR' on p age 29 undefined on input line 4853. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4856. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BXOR:mpi4py.MPI.BXOR' on p age 29 undefined on input line 4860. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4863. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAXLOC:mpi4py.MPI.MAXLOC' on page 29 undefined on input line 4867. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4870. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MINLOC:mpi4py.MPI.MINLOC' on page 29 undefined on input line 4874. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4877. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REPLACE:mpi4py.MPI.REPLACE ' on page 29 undefined on input line 4881. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4884. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.NO_OP:mpi4py.MPI.NO_OP' on page 29 undefined on input line 4888. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 29 undefined on input line 4891. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.GROUP_NULL:mpi4py.MPI.GROU P_NULL' on page 29 undefined on input line 4895. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 29 undefined on input line 4898. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.GROUP_EMPTY:mpi4py.MPI.GRO UP_EMPTY' on page 29 undefined on input line 4902. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 29 undefined on input line 4905. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INFO_NULL:mpi4py.MPI.INFO_ NULL' on page 29 undefined on input line 4909. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 29 undefined on input line 4912. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INFO_ENV:mpi4py.MPI.INFO_E NV' on page 29 undefined on input line 4916. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 29 undefined on input line 4919. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRHANDLER_NULL:mpi4py.MPI .ERRHANDLER_NULL' on page 29 undefined on input line 4923. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 29 undefined on input line 4926. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_RETURN:mpi4py.MPI.E RRORS_RETURN' on page 29 undefined on input line 4930. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 29 undefined on input line 4933. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_ARE_FATAL:mpi4py.MP I.ERRORS_ARE_FATAL' on page 29 undefined on input line 4937. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 29 undefined on input line 4940. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_NULL:mpi4py.MPI.COMM_ NULL' on page 29 undefined on input line 4944. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 29 undefined on input line 4947. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_SELF:mpi4py.MPI.COMM_ SELF' on page 29 undefined on input line 4951. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 29 undefined on input line 4954. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_WORLD:mpi4py.MPI.COMM _WORLD' on page 29 undefined on input line 4958. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 29 undefined on input line 4961. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_NULL:mpi4py.MPI.WIN_NU LL' on page 29 undefined on input line 4965. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on pag e 29 undefined on input line 4968. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.FILE_NULL:mpi4py.MPI.FILE_ NULL' on page 29 undefined on input line 4972. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on p age 29 undefined on input line 4975. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.pickle:mpi4py.MPI.pickle' on page 29 undefined on input line 4979. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle' on page 29 undefined on input line 4982. [29] [30] [31] [32] Package longtable Warning: Column widths have changed (longtable) in table 1 on input line 4985. [33] [34] LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 35 undefined on input line 5001. LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.SupportsBuffer' on page 35 undefined on input line 5137. 
[the same "Hyper reference ... undefined" warning repeats on page 35, all on input line 5137 and each emitted twice, for the mpi4py.typing names SupportsBuffer, SupportsDLPack, SupportsCAI, Buffer, Bottom, InPlace, Aint, Count, Displ, Offset, TypeSpec, BufSpec, BufSpecB, BufSpecV, BufSpecW, TargetSpec]
Package tabulary Warning: No suitable columns! on input line 5137.
[35]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.SupportsBuffer' on page 36 undefined on input line 5218.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.SupportsDLPack' on page 36 undefined on input line 5218.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.SupportsCAI' on page 36 undefined on input line 5218.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.BottomType' on page 36 undefined on input line 5233.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.InPlaceType:mpi4py.MPI.InPlaceType' on page 36 undefined on input line 5248.
[further "Hyper reference ... undefined" warnings follow on page 36, input lines 5323-5360, for reference/mpi4py.MPI.Datatype, reference/mpi4py.MPI.BottomType, and the mpi4py.typing names Buffer, Count, TypeSpec, Bottom, SupportsBuffer, SupportsDLPack, SupportsCAI]
[36]
[the same warnings repeat on page 37, input lines 5376-5446, for the mpi4py.typing names Buffer, Count, Displ, TypeSpec, Bottom, SupportsBuffer, SupportsDLPack, SupportsCAI and for reference/mpi4py.MPI.Datatype and reference/mpi4py.MPI.BottomType]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.BottomType' on page 37 undefined on input line 5446.

Underfull \hbox (badness 10000) in paragraph at lines 5445--5447
[][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]]] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 SupportsBuffer \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsDLPack

Underfull \hbox (badness 6188) in paragraph at lines 5445--5447
\T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsCAI\T1/qtm/m/n/10 , \T1/txtt/m/sl/10 Datatype \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str[][]\T1/qtm/m/n/10 ] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 SupportsBuffer \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsDLPack \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsCAI\T1/qtm/m/n/10 ,

Underfull \hbox (badness 6268) in paragraph at lines 5445--5447
[][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]], \T1/txtt/m/sl/10 Datatype \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str[][]\T1/qtm/m/n/10 ] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 SupportsBuffer \T1/qtm/m/n/10 |

Underfull \hbox (badness 8151) in paragraph at lines 5445--5447
\T1/txtt/m/sl/10 SupportsDLPack \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsCAI\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], \T1/txtt/m/sl/10 Datatype \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str[][]\T1/qtm/m/n/10 ]

Underfull \hbox (badness 10000) in paragraph at lines 5445--5447
\T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 BottomType \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 None[][]\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]], \T1/txtt/m/sl/10 Datatype\T1/qtm/m/n/10 ] |

LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 37 undefined on input line 5462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 37 undefined on input line 5462.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 37 undefined on input line 5466.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Count' on page 37 undefined on input line 5466.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 37 undefined on input line 5466.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 37 undefined on input line 5466.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 37 undefined on input line 5470.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Count' on page 37 undefined on input line 5470.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 37 undefined on input line 5470.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 37 undefined on input line 5470.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Bottom' on page 37 undefined on input line 5474.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Count' on page 37 undefined on input line 5474.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 37 undefined on input line 5474.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 37 undefined on input line 5474.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Bottom' on page 37 undefined on input line 5478.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Count' on page 37 undefined on input line 5478.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 37 undefined on input line 5478.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 37 undefined on input line 5478.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.SupportsBuffer' on page 37 undefined on input line 5483.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.SupportsDLPack' on page 37 undefined on input line 5483.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.SupportsCAI' on page 37 undefined on input line 5483.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 37 undefined on input line 5483.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.BottomType' on page 37 undefined on input line 5483.
Underfull \hbox (badness 10000) in paragraph at lines 5482--5484
[]\T1/qtm/m/n/10 alias of [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 SupportsBuffer \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsDLPack \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsCAI\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 Datatype\T1/qtm/m/n/10 ]]

Underfull \hbox (badness 10000) in paragraph at lines 5482--5484
\T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 SupportsBuffer \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsDLPack \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsCAI\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ],

Underfull \hbox (badness 10000) in paragraph at lines 5482--5484
[][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]], [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 Datatype\T1/qtm/m/n/10 ]] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [\T1/txtt/m/sl/10 SupportsBuffer \T1/qtm/m/n/10 | \T1/txtt/m/sl/10 SupportsDLPack \T1/qtm/m/n/10 |
[37]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 38 undefined on input line 5499.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 38 undefined on input line 5507.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 38 undefined on input line 5511.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Count' on page 38 undefined on input line 5511.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Displ' on page 38 undefined on input line 5515.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Count' on page 38 undefined on input line 5515.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 38 undefined on input line 5515.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 38 undefined on input line 5520.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 38 undefined on input line 5587.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 38 undefined on input line 5588.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 38 undefined on input line 5589.
[38]
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 39 undefined on input line 5616.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 39 undefined on input line 5620.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 39 undefined on input line 5624.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Spawn' on page 39 undefined on input line 5626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_SELF:mpi4py.MPI.COMM_SELF' on page 39 undefined on input line 5626.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 39 undefined on input line 5629.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 39 undefined on input line 5638.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 39 undefined on input line 5648.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 39 undefined on input line 5650.
LaTeX Warning: Hyper reference `mpi4py.futures:envvar-MPI4PY_FUTURES_MAX_WORKERS' on page 39 undefined on input line 5667.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 39 undefined on input line 5697.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Spawn' on page 39 undefined on input line 5698.
[39]
LaTeX Warning: Hyper reference `mpi4py.futures:envvar-MPI4PY_FUTURES_USE_PKL5' on page 40 undefined on input line 5739.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:module-mpi4py.util.pkl5' on page 40 undefined on input line 5742.
LaTeX Warning: Hyper reference `mpi4py.futures:envvar-MPI4PY_FUTURES_BACKOFF' on page 40 undefined on input line 5749.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.map' on page 40 undefined on input line 5789.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.map' on page 40 undefined on input line 5819.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.map' on page 40 undefined on input line 5821.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.starmap' on page 40 undefined on input line 5822.
[40]
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.submit' on page 41 undefined on input line 5843.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.map' on page 41 undefined on input line 5843.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.shutdown' on page 41 undefined on input line 5844.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.shutdown' on page 41 undefined on input line 5864.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.bootup' on page 41 undefined on input line 5886.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.submit' on page 41 undefined on input line 5888.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.bootup' on page 41 undefined on input line 5889.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 41 undefined on input line 5916.
LaTeX Warning: Hyper reference `mpi4py.futures:envvar-MPI4PY_FUTURES_MAX_WORKERS' on page 41 undefined on input line 5917.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 41 undefined on input line 5934.
LaTeX Warning: Hyper reference `mpi4py.futures:envvar-MPI4PY_FUTURES_USE_PKL5' on page 41 undefined on input line 5935.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:module-mpi4py.util.pkl5' on page 41 undefined on input line 5942.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 41 undefined on input line 5957.
LaTeX Warning: Hyper reference `mpi4py.futures:envvar-MPI4PY_FUTURES_BACKOFF' on page 41 undefined on input line 5958.
[41]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_MULTIPLE:mpi4py.MPI.THREAD_MULTIPLE' on page 42 undefined on input line 5976.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_MULTIPLE:mpi4py.MPI.THREAD_MULTIPLE' on page 42 undefined on input line 5980.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 42 undefined on input line 5980.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_SERIALIZED:mpi4py.MPI.THREAD_SERIALIZED' on page 42 undefined on input line 5982.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 42 undefined on input line 5982.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_SERIALIZED:mpi4py.MPI.THREAD_SERIALIZED' on page 42 undefined on input line 5989.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_SINGLE:mpi4py.MPI.THREAD_SINGLE' on page 42 undefined on input line 5989.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_FUNNELED:mpi4py.MPI.THREAD_FUNNELED' on page 42 undefined on input line 5990.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 42 undefined on input line 5990.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 42 undefined on input line 5991.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 42 undefined on input line 6007.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPICommExecutor' on page 42 undefined on input line 6011.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 42 undefined on input line 6013.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPICommExecutor' on page 42 undefined on input line 6019.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 42 undefined on input line 6022.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 42 undefined on input line 6031.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_WORLD:mpi4py.MPI.COMM_WORLD' on page 42 undefined on input line 6032.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 42 undefined on input line 6036.
Overfull \vbox (0.54955pt too high) detected at line 6049
[42]
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPICommExecutor' on page 43 undefined on input line 6056.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_SELF:mpi4py.MPI.COMM_SELF' on page 43 undefined on input line 6057.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPICommExecutor' on page 43 undefined on input line 6065.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 43 undefined on input line 6075.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 43 undefined on input line 6077.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 43 undefined on input line 6105.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_WORLD:mpi4py.MPI.COMM_WORLD' on page 43 undefined on input line 6106.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 43 undefined on input line 6109.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 43 undefined on input line 6119.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 43 undefined on input line 6122.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.map' on page 43 undefined on input line 6126.
[43]
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 44 undefined on input line 6148.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 44 undefined on input line 6150.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 44 undefined on input line 6167.
LaTeX Warning: Hyper reference `mpi4py.futures:cpi-py' on page 44 undefined on input line 6168.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.num_workers' on page 44 undefined on input line 6172.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.submit' on page 44 undefined on input line 6174.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.get_comm_workers' on page 44 undefined on input line 6178.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Barrier' on page 44 undefined on input line 6180.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.submit' on page 44 undefined on input line 6189.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 44 undefined on input line 6200.
[44]
LaTeX Warning: Hyper reference `mpi4py.futures:julia-py' on page 45 undefined on input line 6278.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 45 undefined on input line 6280.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 45 undefined on input line 6280.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor.map' on page 45 undefined on input line 6284.
[45]
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 46 undefined on input line 6342.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 46 undefined on input line 6374.
LaTeX Warning: Hyper reference `mpi4py.futures:envvar-MPI4PY_FUTURES_MAX_WORKERS' on page 46 undefined on input line 6375.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 46 undefined on input line 6387.
LaTeX Warning: Hyper reference `mpi4py.futures:cpi-py' on page 46 undefined on input line 6410.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 46 undefined on input line 6411.
LaTeX Warning: Hyper reference `mpi4py.futures:parallel-tasks' on page 46 undefined on input line 6413.
[46] [47]
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 48 undefined on input line 6495.
LaTeX Warning: Citation `mpi4py.futures:id4' on page 48 undefined on input line 6497.
LaTeX Warning: Hyper reference `mpi4py.util:module-mpi4py.util' on page 48 undefined on input line 6508.
LaTeX Warning: Hyper reference `mpi4py.util.dtlib:module-mpi4py.util.dtlib' on page 48 undefined on input line 6520.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 48 undefined on input line 6538.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 48 undefined on input line 6556.
[48]
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:module-mpi4py.util.pkl5' on page 49 undefined on input line 6589.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Request' on page 49 undefined on input line 6623.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 49 undefined on input line 6623.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 49 undefined on input line 6691.
[49]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 50 undefined on input line 6713.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 50 undefined on input line 6735.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Message' on page 50 undefined on input line 6808.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 50 undefined on input line 6808.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 50 undefined on input line 6840.
[50]
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Request' on page 51 undefined on input line 6862.
[51]
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Request' on page 52 undefined on input line 7039.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Request' on page 52 undefined on input line 7072.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Request' on page 52 undefined on input line 7105.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 52 undefined on input line 7124.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 52 undefined on input line 7136.
[52]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 53 undefined on input line 7167.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Request' on page 53 undefined on input line 7181.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 53 undefined on input line 7212.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 53 undefined on input line 7224.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 53 undefined on input line 7257.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Message' on page 53 undefined on input line 7263.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 53 undefined on input line 7290.
LaTeX Warning: Hyper reference `mpi4py.util.pkl5:mpi4py.util.pkl5.Message' on page 53 undefined on input line 7296.
[53] [54] [55]
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool' on page 56 undefined on input line 7551.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 56 undefined on input line 7552.
LaTeX Warning: Hyper reference `mpi4py.futures:module-mpi4py.futures' on page 56 undefined on input line 7560.
LaTeX Warning: Hyper reference `mpi4py.futures:mpi4py.futures.MPIPoolExecutor' on page 56 undefined on input line 7612.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 56 undefined on input line 7641.
[56]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7655.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.apply' on page 57 undefined on input line 7669.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.ApplyResult' on page 57 undefined on input line 7669.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7674.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7686.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.AsyncResult' on page 57 undefined on input line 7696.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7696.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.imap' on page 57 undefined on input line 7724.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.imap_unordered' on page 57 undefined on input line 7724.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 57 undefined on input line 7730.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7730.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 57 undefined on input line 7734.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7744.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.map' on page 57 undefined on input line 7758.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.MapResult' on page 57 undefined on input line 7758.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 57 undefined on input line 7763.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7763.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 57 undefined on input line 7767.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7775.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.MapResult' on page 57 undefined on input line 7785.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 57 undefined on input line 7785.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.map' on page 57 undefined on input line 7799.
[57]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 58 undefined on input line 7807.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7807.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 58 undefined on input line 7811.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7821.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.imap' on page 58 undefined on input line 7835.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 58 undefined on input line 7840.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7840.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.S' on page 58 undefined on input line 7844.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7854.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.istarmap' on page 58 undefined on input line 7882.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.istarmap_unordered' on page 58 undefined on input line 7882.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7888.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7902.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.starmap' on page 58 undefined on input line 7916.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.MapResult' on page 58 undefined on input line 7916.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7921.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7933.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.MapResult' on page 58 undefined on input line 7943.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 58 undefined on input line 7943.
[58]
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.starmap' on page 59 undefined on input line 7957.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 59 undefined on input line 7965.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 59 undefined on input line 7979.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.istarmap' on page 59 undefined on input line 7993.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 59 undefined on input line 7998.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 59 undefined on input line 8012.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool' on page 59 undefined on input line 8083.
[59]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 60 undefined on input line 8123.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.AsyncResult' on page 60 undefined on input line 8201.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.apply_async' on page 60 undefined on input line 8204.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.AsyncResult' on page 60 undefined on input line 8216.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.map_async' on page 60 undefined on input line 8219.
LaTeX Warning: Hyper reference `mpi4py.util.pool:mpi4py.util.pool.Pool.starmap_async' on page 60 undefined on input line 8219.
LaTeX Warning: Hyper reference `mpi4py.util.sync:module-mpi4py.util.sync' on page 60 undefined on input line 8233.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 60 undefined on input line 8268.
[60]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 61 undefined on input line 8409.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 61 undefined on input line 8413.
[61]
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Mutex' on page 62 undefined on input line 8535.
LaTeX Warning: Citation `mpi4py.util.sync:mcs-paper' on page 62 undefined on input line 8554.
LaTeX Warning: Citation `mpi4py.util.sync:uam-book' on page 62 undefined on input line 8555.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 62 undefined on input line 8569.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 62 undefined on input line 8577.
[62] [63]
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Mutex' on page 64 undefined on input line 8769.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 64 undefined on input line 8777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 64 undefined on input line 8781.
[64]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 65 undefined on input line 8925.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.T' on page 65 undefined on input line 8933.
[65]
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Semaphore.acquire' on page 66 undefined on input line 9023.
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Semaphore.release' on page 66 undefined on input line 9023.
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Semaphore.acquire' on page 66 undefined on input line 9024.
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Semaphore.release' on page 66 undefined on input line 9025.
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Semaphore.release' on page 66 undefined on input line 9031.
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Counter' on page 66 undefined on input line 9034.
LaTeX Warning: Hyper reference `mpi4py.util.sync:mpi4py.util.sync.Condition' on page 66 undefined on input line 9034.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 66 undefined on input line 9057.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 66 undefined on input line 9061.
[66] [67]
LaTeX Warning: Hyper reference `mpi4py:module-mpi4py' on page 68 undefined on input line 9234.
LaTeX Warning: Hyper reference `mpi4py:module-mpi4py' on page 68 undefined on input line 9237.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_ARE_FATAL:mpi4py.MPI.ERRORS_ARE_FATAL' on page 68 undefined on input line 9237.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_RETURN:mpi4py.MPI.ERRORS_RETURN' on page 68 undefined on input line 9238.
LaTeX Warning: Hyper reference `mpi4py:module-mpi4py' on page 68 undefined on input line 9241.
LaTeX Warning: Hyper reference `mpi4py:module-mpi4py' on page 68 undefined on input line 9278.
LaTeX Warning: Hyper reference `mpi4py:module-mpi4py' on page 68 undefined on input line 9287.
[68]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 69 undefined on input line 9423.
Package tabulary Warning: No suitable columns! on input line 9423.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 69 undefined on input line 9423.
[69]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.BottomType' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOTTOM:mpi4py.MPI.BOTTOM' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BufferAutomaticType:mpi4py.MPI.BufferAutomaticType' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BUFFER_AUTOMATIC:mpi4py.MPI.BUFFER_AUTOMATIC' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.Distgraphcomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graphcomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Grequest:mpi4py.MPI.Grequest' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.InPlaceType:mpi4py.MPI.InPlaceType' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IN_PLACE:mpi4py.MPI.IN_PLACE' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.memory:mpi4py.MPI.memory' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 70 undefined on input line 9626.
Package tabulary Warning: No suitable columns! on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.BottomType' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOTTOM:mpi4py.MPI.BOTTOM' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BufferAutomaticType:mpi4py.MPI.BufferAutomaticType' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BUFFER_AUTOMATIC:mpi4py.MPI.BUFFER_AUTOMATIC' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.Distgraphcomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graphcomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Grequest:mpi4py.MPI.Grequest' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.InPlaceType:mpi4py.MPI.InPlaceType' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IN_PLACE:mpi4py.MPI.IN_PLACE' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.memory:mpi4py.MPI.memory' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 70 undefined on input line 9626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOTTOM:mpi4py.MPI.BOTTOM' on page 70 undefined on input line 9645.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BUFFER_AUTOMATIC:mpi4py.MPI.BUFFER_AUTOMATIC' on page 70 undefined on input line 9680.
[70]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm' on page 71 undefined on input line 9712.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm' on page 71 undefined on input line 9725.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_cart_rank' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_coords' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_dim' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_topo' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Shift' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Sendrecv' on page 71 undefined on input line 9787.
Underfull \hbox (badness 5533) in paragraph at lines 9787--9787
[]|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Sub' on page 71 undefined on input line 9787.
Package tabulary Warning: No suitable columns! on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_cart_rank' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_coords' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_dim' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Get_topo' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Shift' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Sendrecv' on page 71 undefined on input line 9787.
Underfull \hbox (badness 5533) in paragraph at lines 9787--9787
[]|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.Sub' on page 71 undefined on input line 9787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.coords' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.dim' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.dims' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.ndim' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.periods' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.topo' on page 71 undefined on input line 9842.
Package tabulary Warning: No suitable columns! on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.coords' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.dim' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.dims' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.ndim' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.periods' on page 71 undefined on input line 9842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm.topo' on page 71 undefined on input line 9842.
[71]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Sendrecv' on page 72 undefined on input line 9934.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm' on page 72 undefined on input line 9971.
[72]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 73 undefined on input line 10079.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Abort' on page 73 undefined on input line 10120.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ack_failed' on page 73 undefined on input line 10127.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Agree' on page 73 undefined on input line 10134.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allgather' on page 73 undefined on input line 10141.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allgather_init' on page 73 undefined on input line 10148.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allgatherv' on page 73 undefined on input line 10155.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allgatherv_init' on page 73 undefined on input line 10162.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allreduce' on page 73 undefined on input line 10169.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Allreduce_init' on page 73 undefined on input line 10176.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Alltoall' on page 73 undefined on input line 10183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Alltoall_init' on page 73 undefined on input line 10190.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Alltoallv' on page 73 undefined on input line 10197.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Alltoallv_init' on page 73 undefined on input line 10204.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Alltoallw' on page 73 undefined on input line 10211.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Alltoallw_init' on page 73 undefined on input line 10218.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Attach_buffer' on page 73 undefined on input line 10225.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Barrier' on page 73 undefined on input line 10232.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Barrier_init' on page 73 undefined on input line 10239.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Bcast' on page 73 undefined on input line 10246.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Bcast_init' on page 73 undefined on input line 10253.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Bsend' on page 73 undefined on input line 10260.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Bsend_init' on page 73 undefined on input line 10267.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Call_errhandler' on page 73 undefined on input line 10274.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Clone' on page 73 undefined on input line 10281.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Compare' on page 73 undefined on input line 10288.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Create' on page 73 undefined on input line 10295.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Create_errhandler' on page 73 undefined on input line 10302.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Create_keyval' on page 73 undefined on input line 10309.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Delete_attr' on page 73 undefined on input line 10316.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Detach_buffer' on page 73 undefined on input line 10323.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Disconnect' on page 73 undefined on input line 10330.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Dup' on page 73 undefined on input line 10337.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Dup_with_info' on page 73 undefined on input line 10344.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Flush_buffer' on page 73 undefined on input line 10351.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Free' on page 73 undefined on input line 10358.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Free_keyval' on page 73 undefined on input line 10365.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gather' on page 73 undefined on input line 10372.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gather_init' on page 73 undefined on input line 10379.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gatherv' on page 73 undefined on input line 10386.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Gatherv_init' on page 73 undefined on input line 10393.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_attr' on page 73 undefined on input line 10400.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_errhandler' on page 73 undefined on input line 10407.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_failed' on page 73 undefined on input line 10414.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_group' on page 73 undefined on input line 10421.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_info' on page 73 undefined on input line 10428.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_name' on page 73 undefined on input line 10435.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_parent' on page 73 undefined on input line 10442.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_rank' on page 73 undefined on input line 10449.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_size' on page 73 undefined on input line 10456.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Get_topology' on page 73 undefined on input line 10463.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iagree' on page 73 undefined on input line 10470.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iallgather' on page 73 undefined on input line 10477.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iallgatherv' on page 73 undefined on input line 10484.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iallreduce' on page 73 undefined on input line 10491.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ialltoall' on page 73 undefined on input line 10498.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ialltoallv' on page 73 undefined on input line 10505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ialltoallw' on page 73 undefined on input line 10512.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ibarrier' on page 73 undefined on input line 10519.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ibcast' on page 73 undefined on input line 10526.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ibsend' on page 73 undefined on input line 10533.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Idup' on page 73 undefined on input line 10540.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Idup_with_info' on page 73 undefined on input line 10547.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iflush_buffer' on page 73 undefined on input line 10554.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Igather' on page 73 undefined on input line 10561.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Igatherv' on page 73 undefined on input line 10568.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Improbe' on page 73 undefined on input line 10575.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iprobe' on page 73 undefined on input line 10582.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Irecv' on page 73 undefined on input line 10589.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ireduce' on page 73 undefined on input line 10596.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ireduce_scatter' on page 73 undefined on input line 10603.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ireduce_scatter_block' on page 73 undefined on input line 10610.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Irsend' on page 73 undefined on input line 10617.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Is_inter' on page 73 undefined on input line 10624.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Is_intra' on page 73 undefined on input line 10631.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Is_revoked' on page 73 undefined on input line 10638.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iscatter' on page 73 undefined on input line 10645.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Iscatterv' on page 73 undefined on input line 10652.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Isend' on page 73 undefined on input line 10659.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Isendrecv' on page 73 undefined on input line 10666.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Isendrecv_replace' on page 73 undefined on input line 10673.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ishrink' on page 73 undefined on input line 10680.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Issend' on page 73 undefined on input line 10687.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Join' on page 73 undefined on input line 10694.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Mprobe' on page 73 undefined on input line 10701.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Precv_init' on page 73 undefined on input line 10708.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Probe' on page 73 undefined on input line 10715.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Psend_init' on page 73 undefined on input line 10722.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Recv' on page 73 undefined on input line 10729.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Recv_init' on page 73 undefined on input line 10736.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduce' on page 73 undefined on input line 10743.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduce_init' on page 73 undefined on input line 10750.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduce_scatter' on page 73 undefined on input line 10757.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduce_scatter_block' on page 73 undefined on input line 10764.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduce_scatter_block_init' on page 73 undefined on input line 10771.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Reduce_scatter_init' on page 73 undefined on input line 10778.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Revoke' on page 73 undefined on input line 10785.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Rsend' on page 73 undefined on input line 10792.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Rsend_init' on page 73 undefined on input line 10799.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatter' on page 73 undefined on input line 10806.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatter_init' on page 73 undefined on input line 10813.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatterv' on page 73 undefined on input line 10820.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Scatterv_init' on page 73 undefined on input line 10827.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Send' on page 73 undefined on input line 10834.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Send_init' on page 73 undefined on input line 10841.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Sendrecv' on page 73 undefined on input line 10848.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Sendrecv_replace' on page 73 undefined on input line 10855.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Set_attr' on page 73 undefined on input line 10862.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Set_errhandler' on page 73 undefined on input line 10869.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Set_info' on page 73 undefined on input line 10876.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Set_name' on page 73 undefined on input line 10883.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Shrink' on page 73 undefined on input line 10890.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Split' on page 73 undefined on input line 10897.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Split_type' on page 73 undefined on input line 10904.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ssend' on page 73 undefined on input line 10911.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Ssend_init' on page 73 undefined on input line 10918.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.allgather' on page 73 undefined on input line 10925.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.allreduce' on page 73 undefined on input line 10932.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.alltoall' on page 73 undefined on input line 10939.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.barrier' on page 73 undefined on input line 10946.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.bcast' on page 73 undefined on input line 10953.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.bsend' on page 73 undefined on input line 10960.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.f2py' on page 73 undefined on input line 10967.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.free' on page 73 undefined on input line 10974.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Free' on page 73 undefined on input line 10977.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.fromhandle' on page 73 undefined on input line 10981.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.gather' on page 73 undefined on input line 10988.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.ibsend' on page 73 undefined on input line 10995.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.improbe' on page 73 undefined on input line 11002.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.iprobe' on page 73 undefined on input line 11009.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.irecv' on page 73 undefined on input line 11016.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.isend' on page 73 undefined on input line 11023.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.issend' on page 73 undefined on input line 11030.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.mprobe' on page 73 undefined on input line 11037.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.probe ' on page 73 undefined on input line 11044. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.py2f' on page 73 undefined on input line 11051. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.recv' on page 73 undefined on input line 11058. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.reduc e' on page 73 undefined on input line 11065. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.scatt er' on page 73 undefined on input line 11072. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.send' on page 73 undefined on input line 11079. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.sendr ecv' on page 73 undefined on input line 11086. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.ssend ' on page 73 undefined on input line 11093. Package longtable Warning: Column widths have changed (longtable) in table 2 on input line 11099. [73] [74] [75] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.group ' on page 76 undefined on input line 11183. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.handl e' on page 76 undefined on input line 11183. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.info' on page 76 undefined on input line 11183. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.is_in ter' on page 76 undefined on input line 11183. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.is_in tra' on page 76 undefined on input line 11183. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.is_to po' on page 76 undefined on input line 11183. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.name' on page 76 undefined on input line 11183. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.rank' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.size' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.topology' on page 76 undefined on input line 11183.
Package tabulary Warning: No suitable columns! on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.group' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.handle' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.info' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.is_inter' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.is_intra' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.is_topo' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.name' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.rank' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.size' on page 76 undefined on input line 11183.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.topology' on page 76 undefined on input line 11183.
[76]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 77 undefined on input line 11278.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 77 undefined on input line 11278.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 77 undefined on input line 11282.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 77 undefined on input line 11307.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 77 undefined on input line 11307.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 77 undefined on input line 11311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 77 undefined on input line 11315.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 77 undefined on input line 11321.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 77 undefined on input line 11344.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 77 undefined on input line 11344.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 77 undefined on input line 11348.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 77 undefined on input line 11373.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 77 undefined on input line 11373.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 77 undefined on input line 11377.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 77 undefined on input line 11381.
[77]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 78 undefined on input line 11387.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 78 undefined on input line 11406.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 78 undefined on input line 11406.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 78 undefined on input line 11410.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 78 undefined on input line 11414.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 78 undefined on input line 11439.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 78 undefined on input line 11439.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 78 undefined on input line 11443.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 78 undefined on input line 11447.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 78 undefined on input line 11451.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 78 undefined on input line 11457.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 78 undefined on input line 11479.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 78 undefined on input line 11479.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 78 undefined on input line 11483.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 78 undefined on input line 11508.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 78 undefined on input line 11508.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 78 undefined on input line 11512.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 78 undefined on input line 11516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 78 undefined on input line 11522.
[78]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 79 undefined on input line 11545.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 79 undefined on input line 11545.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 79 undefined on input line 11549.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 79 undefined on input line 11574.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 79 undefined on input line 11574.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 79 undefined on input line 11578.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 79 undefined on input line 11582.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 79 undefined on input line 11588.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 79 undefined on input line 11611.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 79 undefined on input line 11611.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 79 undefined on input line 11615.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 79 undefined on input line 11640.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 79 undefined on input line 11640.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 79 undefined on input line 11644.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 79 undefined on input line 11648.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 79 undefined on input line 11654.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 79 undefined on input line 11672.
[79]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 80 undefined on input line 11712.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 80 undefined on input line 11716.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 80 undefined on input line 11735.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 80 undefined on input line 11764.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 80 undefined on input line 11772.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 80 undefined on input line 11778.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 80 undefined on input line 11797.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 80 undefined on input line 11830.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 80 undefined on input line 11844.
[80]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 81 undefined on input line 11902.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 81 undefined on input line 11924.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 81 undefined on input line 11928.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 81 undefined on input line 11946.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 81 undefined on input line 11950.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 81 undefined on input line 11969.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 81 undefined on input line 11973.
[81]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 82 undefined on input line 12023.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 82 undefined on input line 12059.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 82 undefined on input line 12081.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 82 undefined on input line 12162.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 82 undefined on input line 12162.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 82 undefined on input line 12166.
[82]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 83 undefined on input line 12195.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 83 undefined on input line 12195.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 83 undefined on input line 12199.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 83 undefined on input line 12207.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 83 undefined on input line 12213.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 83 undefined on input line 12236.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 83 undefined on input line 12236.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 83 undefined on input line 12240.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 83 undefined on input line 12269.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 83 undefined on input line 12269.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 83 undefined on input line 12273.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 83 undefined on input line 12281.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 83 undefined on input line 12287.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 83 undefined on input line 12327.
[83]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 84 undefined on input line 12345.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 84 undefined on input line 12363.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 84 undefined on input line 12381.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 84 undefined on input line 12417.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 84 undefined on input line 12489.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 84 undefined on input line 12493.
[84]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 85 undefined on input line 12512.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 85 undefined on input line 12512.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 85 undefined on input line 12516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 85 undefined on input line 12522.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 85 undefined on input line 12541.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 85 undefined on input line 12541.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 85 undefined on input line 12545.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 85 undefined on input line 12551.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 85 undefined on input line 12570.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 85 undefined on input line 12570.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 85 undefined on input line 12574.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 85 undefined on input line 12578.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 85 undefined on input line 12584.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 85 undefined on input line 12603.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 85 undefined on input line 12603.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 85 undefined on input line 12607.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 85 undefined on input line 12613.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 85 undefined on input line 12632.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 85 undefined on input line 12632.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 85 undefined on input line 12636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 85 undefined on input line 12642.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 85 undefined on input line 12661.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 85 undefined on input line 12661.
[85]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 86 undefined on input line 12665.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 86 undefined on input line 12671.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 86 undefined on input line 12689.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 86 undefined on input line 12708.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 86 undefined on input line 12718.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 86 undefined on input line 12737.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 86 undefined on input line 12751.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 86 undefined on input line 12769.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 86 undefined on input line 12773.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 86 undefined on input line 12791.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 86 undefined on input line 12795.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 86 undefined on input line 12813.
[86]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 87 undefined on input line 12832.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 87 undefined on input line 12832.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 87 undefined on input line 12836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 87 undefined on input line 12846.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 87 undefined on input line 12865.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 87 undefined on input line 12865.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 87 undefined on input line 12869.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 87 undefined on input line 12879.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 87 undefined on input line 12906.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 87 undefined on input line 12912.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 87 undefined on input line 12939.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 87 undefined on input line 12964.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 87 undefined on input line 12978.
[87]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 88 undefined on input line 12997.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 88 undefined on input line 12997.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 88 undefined on input line 13001.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 88 undefined on input line 13005.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 88 undefined on input line 13015.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 88 undefined on input line 13034.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 88 undefined on input line 13034.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 88 undefined on input line 13038.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 88 undefined on input line 13046.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 88 undefined on input line 13052.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 88 undefined on input line 13071.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 88 undefined on input line 13071.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 88 undefined on input line 13075.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 88 undefined on input line 13075.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 88 undefined on input line 13079.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 88 undefined on input line 13085.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 88 undefined on input line 13104.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 88 undefined on input line 13118.
[88]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 89 undefined on input line 13191.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 89 undefined on input line 13195.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 89 undefined on input line 13195.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 89 undefined on input line 13205.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 89 undefined on input line 13224.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 89 undefined on input line 13228.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 89 undefined on input line 13228.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 89 undefined on input line 13238.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 89 undefined on input line 13257.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 89 undefined on input line 13271.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 89 undefined on input line 13290.
[89]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 90 undefined on input line 13302.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 90 undefined on input line 13316.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 90 undefined on input line 13348.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 90 undefined on input line 13370.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 90 undefined on input line 13388.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 90 undefined on input line 13388.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 90 undefined on input line 13407.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 90 undefined on input line 13421.
[90]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 91 undefined on input line 13443.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 91 undefined on input line 13470.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 91 undefined on input line 13476.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 91 undefined on input line 13495.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 91 undefined on input line 13511.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 91 undefined on input line 13517.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 91 undefined on input line 13549.
[91]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 92 undefined on input line 13574.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 92 undefined on input line 13590.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 92 undefined on input line 13596.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 92 undefined on input line 13620.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 92 undefined on input line 13632.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 92 undefined on input line 13657.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 92 undefined on input line 13671.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 92 undefined on input line 13690.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 92 undefined on input line 13690.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 92 undefined on input line 13694.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 92 undefined on input line 13698.
[92]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 93 undefined on input line 13727.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 93 undefined on input line 13727.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 93 undefined on input line 13731.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 93 undefined on input line 13735.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 93 undefined on input line 13743.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 93 undefined on input line 13749.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 93 undefined on input line 13768.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 93 undefined on input line 13768.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 93 undefined on input line 13772.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 93 undefined on input line 13780.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 93 undefined on input line 13805.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 93 undefined on input line 13805.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 93 undefined on input line 13809.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 93 undefined on input line 13809.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 93 undefined on input line 13813.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 93 undefined on input line 13838.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 93 undefined on input line 13838.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 93 undefined on input line 13842.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 93 undefined on input line 13842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 93 undefined on input line 13846.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 93 undefined on input line 13850.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 93 undefined on input line 13856.
[93]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 94 undefined on input line 13875.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 94 undefined on input line 13875.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 94 undefined on input line 13879.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 94 undefined on input line 13887.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 94 undefined on input line 13891.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 94 undefined on input line 13897.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 94 undefined on input line 13934.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 94 undefined on input line 13967.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 94 undefined on input line 13981.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 94 undefined on input line 14000.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 94 undefined on input line 14004.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 94 undefined on input line 14004.
[94]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 95 undefined on input line 14033.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 95 undefined on input line 14037.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 95 undefined on input line 14037.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 95 undefined on input line 14045.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 95 undefined on input line 14051.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 95 undefined on input line 14074.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 95 undefined on input line 14078.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 95 undefined on input line 14078.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 95 undefined on input line 14107.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 95 undefined on input line 14111.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 95 undefined on input line 14111.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 95 undefined on input line 14119.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 95 undefined on input line 14125.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Send' on page 95 undefined on input line 14144.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 95 undefined on input line 14151.
[95]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 96 undefined on input line 14184.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 96 undefined on input line 14198.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 96 undefined on input line 14230.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 96 undefined on input line 14242.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 96 undefined on input line 14254.
[96]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 97 undefined on input line 14292.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 97 undefined on input line 14312.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 97 undefined on input line 14365.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 97 undefined on input line 14387.
[97]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 98 undefined on input line 14431.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 98 undefined on input line 14460.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 98 undefined on input line 14487.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 98 undefined on input line 14493.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 98 undefined on input line 14512.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 98 undefined on input line 14545.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 98 undefined on input line 14559.
[98]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 99 undefined on input line 14604.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Barrier' on page 99 undefined on input line 14650.
[99]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 100 undefined on input line 14736.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm.Free' on page 100 undefined on input line 14750.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 100 undefined on input line 14776.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 100 undefined on input line 14838.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 100 undefined on input line 14865.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 100 undefined on input line 14871.
[100]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 101 undefined on input line 14898.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 101 undefined on input line 14923.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 101 undefined on input line 14937.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 101 undefined on input line 14970.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 101 undefined on input line 15003.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 101 undefined on input line 15030.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 101 undefined on input line 15036.
[101]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 102 undefined on input line 15063.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 102 undefined on input line 15103.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 102 undefined on input line 15115.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 102 undefined on input line 15144.
[102]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 103 undefined on input line 15247.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 103 undefined on input line 15259.
[103]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 104 undefined on input line 15454.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype.Commit' on page 104 undefined on input line 15495.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype.Create_contiguous' on page 104 undefined on input line 15502.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype.Create_darray' on page 104 undefined on input line 15509.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_f90_complex' on page 104 undefined on input line 15516. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_f90_integer' on page 104 undefined on input line 15523. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_f90_real' on page 104 undefined on input line 15530. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_hindexed' on page 104 undefined on input line 15537. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_hindexed_block' on page 104 undefined on input line 15544. Underfull \hbox (badness 10000) in paragraph at lines 15543--15545 []|\T1/txtt/m/sl/10 Create_hindexed_block\T1/qtm/m/n/10 (blocklength, dis-place - LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_hvector' on page 104 undefined on input line 15551. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_indexed' on page 104 undefined on input line 15558. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_indexed_block' on page 104 undefined on input line 15565. Underfull \hbox (badness 10000) in paragraph at lines 15564--15566 []|\T1/txtt/m/sl/10 Create_indexed_block\T1/qtm/m/n/10 (blocklength, dis-place- LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_keyval' on page 104 undefined on input line 15572. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_resized' on page 104 undefined on input line 15579. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_struct' on page 104 undefined on input line 15586. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_subarray' on page 104 undefined on input line 15593. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Create_vector' on page 104 undefined on input line 15600. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Delete_attr' on page 104 undefined on input line 15607. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Dup' on page 104 undefined on input line 15614. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Free' on page 104 undefined on input line 15621. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Free_keyval' on page 104 undefined on input line 15628. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_attr' on page 104 undefined on input line 15635. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_contents' on page 104 undefined on input line 15642. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_envelope' on page 104 undefined on input line 15649. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_extent' on page 104 undefined on input line 15656. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_name' on page 104 undefined on input line 15663. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_size' on page 104 undefined on input line 15670. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_true_extent' on page 104 undefined on input line 15677. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Get_value_index' on page 104 undefined on input line 15684. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Match_size' on page 104 undefined on input line 15691. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Pack' on page 104 undefined on input line 15698. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Pack_external' on page 104 undefined on input line 15705. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Pack_external_size' on page 104 undefined on input line 15712. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Pack_size' on page 104 undefined on input line 15719. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Set_attr' on page 104 undefined on input line 15726. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Set_name' on page 104 undefined on input line 15733. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Unpack' on page 104 undefined on input line 15740. Underfull \hbox (badness 5203) in paragraph at lines 15742--15745 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Unpack_external' on page 104 undefined on input line 15747. Underfull \hbox (badness 5203) in paragraph at lines 15749--15752 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.decode' on page 104 undefined on input line 15754. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.f2py' on page 104 undefined on input line 15761. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.free' on page 104 undefined on input line 15768. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Free' on page 104 undefined on input line 15771. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.fromcode' on page 104 undefined on input line 15775. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.fromhandle' on page 104 undefined on input line 15782. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.py2f' on page 104 undefined on input line 15789. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.tocode' on page 104 undefined on input line 15796. Package longtable Warning: Column widths have changed (longtable) in table 3 on input line 15802. [104] [105] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.combiner' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.contents' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.envelope' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.extent' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.handle' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.is_named' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.is_predefined' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.lb' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.name' on page 106 undefined on input line 15928. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.size' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.true_extent' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.true_lb' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.true_ub' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.typechar' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.typestr' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.ub' on page 106 undefined on input line 15928. Package tabulary Warning: No suitable columns! on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.combiner' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.contents' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.envelope' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.extent' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.handle' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.is_named' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.is_predefined' on page 106 undefined on input line 15928. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.lb' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.name' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.size' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.true_extent' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.true_lb' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.true_ub' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.typechar' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.typestr' on page 106 undefined on input line 15928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.ub' on page 106 undefined on input line 15928. [106] [107] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 108 undefined on input line 16273. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 108 undefined on input line 16277. [108] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 109 undefined on input line 16348. [109] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 110 undefined on input line 16544. [110] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 111 undefined on input line 16653. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 111 undefined on input line 16657. 
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 1 undefined on input line 16711. LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 1 undefined on input line 16715. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 111 undefined on input line 16723. LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 1 undefined on input line 16755. LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 1 undefined on input line 16759. [111] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 112 undefined on input line 16834. LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 2 undefined on input line 16910. [112] LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 3 undefined on input line 16918. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 113 undefined on input line 16922. LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 3 undefined on input line 16954. LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 11 3 undefined on input line 16962. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 113 undefined on input line 16986. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 113 undefined on input line 17005. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe.Free' on page 113 undefined on input line 17019. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 113 undefined on input line 17045. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 113 undefined on input line 17067. 
[113] [114] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topoco mm' on page 115 undefined on input line 17315. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.D istgraphcomm' on page 115 undefined on input line 17328. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.D istgraphcomm.Get_dist_neighbors' on page 115 undefined on input line 17362. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.D istgraphcomm.Get_dist_neighbors_count' on page 115 undefined on input line 1736 2. Package tabulary Warning: No suitable columns! on input line 17362. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.D istgraphcomm.Get_dist_neighbors' on page 115 undefined on input line 17362. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.D istgraphcomm.Get_dist_neighbors_count' on page 115 undefined on input line 1736 2. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 115 undefined on input line 17431. [115] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.Free' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.f2py' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.free' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.Free' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.fromhandle' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.py2f' on page 116 undefined on input line 17486. Package tabulary Warning: No suitable columns! 
on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.Free' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.f2py' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.free' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.Free' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.fromhandle' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.py2f' on page 116 undefined on input line 17486. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.handle' on page 116 undefined on input line 17506. Package tabulary Warning: No suitable columns! on input line 17506. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.handle' on page 116 undefined on input line 17506. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 116 undefined on input line 17541. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler.Free' on page 116 undefined on input line 17555. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 116 undefined on input line 17581. [116] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on p age 117 undefined on input line 17644. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Call_ errhandler' on page 117 undefined on input line 17685. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Close ' on page 117 undefined on input line 17692. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Creat e_errhandler' on page 117 undefined on input line 17699. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Delet e' on page 117 undefined on input line 17706. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_a mode' on page 117 undefined on input line 17713. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_a tomicity' on page 117 undefined on input line 17720. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_b yte_offset' on page 117 undefined on input line 17727. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_e rrhandler' on page 117 undefined on input line 17734. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_g roup' on page 117 undefined on input line 17741. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_i nfo' on page 117 undefined on input line 17748. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_p osition' on page 117 undefined on input line 17755. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_p osition_shared' on page 117 undefined on input line 17762. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_s ize' on page 117 undefined on input line 17769. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_t ype_extent' on page 117 undefined on input line 17776. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Get_v iew' on page 117 undefined on input line 17783. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iread ' on page 117 undefined on input line 17790. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iread _all' on page 117 undefined on input line 17797. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iread _at' on page 117 undefined on input line 17804. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iread _at_all' on page 117 undefined on input line 17811. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iread _shared' on page 117 undefined on input line 17818. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iwrit e' on page 117 undefined on input line 17825. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iwrit e_all' on page 117 undefined on input line 17832. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iwrit e_at' on page 117 undefined on input line 17839. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iwrit e_at_all' on page 117 undefined on input line 17846. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Iwrit e_shared' on page 117 undefined on input line 17853. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Open' on page 117 undefined on input line 17860. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Preal locate' on page 117 undefined on input line 17867. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read' on page 117 undefined on input line 17874. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ all' on page 117 undefined on input line 17881. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ all_begin' on page 117 undefined on input line 17888. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ all_end' on page 117 undefined on input line 17895. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ at' on page 117 undefined on input line 17902. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ at_all' on page 117 undefined on input line 17909. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ at_all_begin' on page 117 undefined on input line 17916. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ at_all_end' on page 117 undefined on input line 17923. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ ordered' on page 117 undefined on input line 17930. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ ordered_begin' on page 117 undefined on input line 17937. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ ordered_end' on page 117 undefined on input line 17944. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Read_ shared' on page 117 undefined on input line 17951. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Seek' on page 117 undefined on input line 17958. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Seek_ shared' on page 117 undefined on input line 17965. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Set_a tomicity' on page 117 undefined on input line 17972. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Set_e rrhandler' on page 117 undefined on input line 17979. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Set_i nfo' on page 117 undefined on input line 17986. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Set_s ize' on page 117 undefined on input line 17993. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Set_v iew' on page 117 undefined on input line 18000. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Sync' on page 117 undefined on input line 18007. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write ' on page 117 undefined on input line 18014. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _all' on page 117 undefined on input line 18021. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _all_begin' on page 117 undefined on input line 18028. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _all_end' on page 117 undefined on input line 18035. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _at' on page 117 undefined on input line 18042. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _at_all' on page 117 undefined on input line 18049. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _at_all_begin' on page 117 undefined on input line 18056. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _at_all_end' on page 117 undefined on input line 18063. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _ordered' on page 117 undefined on input line 18070. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _ordered_begin' on page 117 undefined on input line 18077. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _ordered_end' on page 117 undefined on input line 18084. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Write _shared' on page 117 undefined on input line 18091. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.f2py' on page 117 undefined on input line 18098. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.free' on page 117 undefined on input line 18105. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Close ' on page 117 undefined on input line 18108. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.fromh andle' on page 117 undefined on input line 18112. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.py2f' on page 117 undefined on input line 18119. Package longtable Warning: Column widths have changed (longtable) in table 4 on input line 18125. [117] [118] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.amode ' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.atomi city' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.group ' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.group _rank' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.group _size' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.handl e' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.info' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.size' on page 119 undefined on input line 18195. Package tabulary Warning: No suitable columns! on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.amode ' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.atomi city' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.group ' on page 119 undefined on input line 18195. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.group _rank' on page 119 undefined on input line 18195. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.group_size' on page 119 undefined on input line 18195.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.handle' on page 119 undefined on input line 18195.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.info' on page 119 undefined on input line 18195.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.size' on page 119 undefined on input line 18195.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 119 undefined on input line 18251.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 119 undefined on input line 18255.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 119 undefined on input line 18278.
[119]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 120 undefined on input line 18366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 120 undefined on input line 18384.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 120 undefined on input line 18402.
[120]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 121 undefined on input lines 18486 and 18508 (twice).
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 121 undefined on input lines 18526, 18548, 18575, 18604.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 121 undefined on input lines 18530, 18552, 18581, 18610.
[121]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 122 undefined on input lines 18628, 18650, 18672, 18699, 18728, 18752.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 122 undefined on input lines 18632, 18654, 18676, 18705, 18734, 18756.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 122 undefined on input line 18775.
[122]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 123 undefined on input line 18787.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 123 undefined on input lines 18834, 18863, 18891, 18914.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 123 undefined on input lines 18838, 18867, 18918.
[123]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 124 undefined on input lines 18947, 18980, 19013, 19038, 19067, 19095.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 124 undefined on input lines 18951, 18984, 19042, 19071.
[124]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 125 undefined on input lines 19118, 19147.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 125 undefined on input lines 19122, 19151.
[125]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 126 undefined on input line 19255.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 126 undefined on input lines 19277, 19338.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 126 undefined on input lines 19326, 19330.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 126 undefined on input line 19381.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 126 undefined on input line 19385.
[126]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 127 undefined on input lines 19410, 19438, 19461, 19494, 19527.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 127 undefined on input lines 19414, 19465, 19498, 19531.
[127]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 128 undefined on input lines 19560, 19585, 19614, 19642, 19665, 19694.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 128 undefined on input lines 19589, 19618, 19669.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 128 undefined on input line 19698.
[128]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 129 undefined on input lines 19723, 19763.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File.Close' on page 129 undefined on input line 19737.
[129]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm' on page 130 undefined on input line 19897.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graphcomm' on page 130 undefined on input line 19910.
LaTeX Warning: Hyper references `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graphcomm.Get_dims', `.Get_neighbors', `.Get_neighbors_count', `.Get_topo' on page 130 undefined on input line 19958 (each reported twice).
Package tabulary Warning: No suitable columns! on input line 19958.
LaTeX Warning: Hyper references `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graphcomm.dims', `.edges', `.index', `.nedges', `.neighbors', `.nneighbors', `.nnodes', `.topo' on page 130 undefined on input line 20027 (each reported twice).
Package tabulary Warning: No suitable columns! on input line 20027.
[130]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 131 undefined on input line 20224.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Grequest:mpi4py.MPI.Grequest' on page 131 undefined on input line 20237.
[131]
LaTeX Warning: Hyper references `reference/mpi4py.MPI.Grequest:mpi4py.MPI.Grequest.Complete', `.Start', `.complete' on page 132 undefined on input line 20278 (each reported twice).
Package tabulary Warning: No suitable columns! on input line 20278.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Grequest:mpi4py.MPI.Grequest' on page 132 undefined on input line 20335.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 132 undefined on input line 20392.
[132]
LaTeX Warning: Hyper references `reference/mpi4py.MPI.Group:mpi4py.MPI.Group.Compare', `.Create_from_session_pset' on page 133 undefined on input line 20538.
LaTeX Warning: Hyper references `reference/mpi4py.MPI.Group:mpi4py.MPI.Group.Difference', `.Dup', `.Excl', `.Free', `.Get_rank', `.Get_size', `.Incl', `.Intersection', `.Range_excl', `.Range_incl', `.Translate_ranks', `.Union', `.f2py', `.free', `.fromhandle', `.py2f' on page 133 undefined on input line 20538 (the whole run of references is reported twice).
Package tabulary Warning: No suitable columns! on input line 20538.
LaTeX Warning: Hyper references `reference/mpi4py.MPI.Group:mpi4py.MPI.Group.handle', `.rank', `.size' on page 133 undefined on input line 20572 (each reported twice).
Package tabulary Warning: No suitable columns! on input line 20572.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 133 undefined on input line 20588.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session' on page 133 undefined on input line 20611.
[133]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 134 undefined on input lines 20640, 20644, 20785, 20789.
[134]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 135 undefined on input lines 20862, 20887, 20891, 20916.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group.Free' on page 135 undefined on input line 20930.
[135]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 136 undefined on input line 20956.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IN_PLACE:mpi4py.MPI.IN_PLACE' on page 136 undefined on input line 21033.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 136 undefined on input line 21078.
[136]
LaTeX Warning: Hyper references `reference/mpi4py.MPI.Info:mpi4py.MPI.Info.Create', `.Create_env', `.Delete', `.Dup', `.Free', `.Get', `.Get_nkeys', `.Get_nthkey', `.Set', `.clear', `.copy', `.f2py', `.free', `.fromhandle', `.get', `.items', `.keys', `.pop', `.popitem', `.py2f', `.update', `.values' on page 137 undefined on input line 21252 (the whole run of references is reported twice).
Package tabulary Warning: No suitable columns! on input line 21252.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info.handle' on page 137 undefined on input line 21272 (reported twice).
Package tabulary Warning: No suitable columns! on input line 21272.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 137 undefined on input line 21288.
[137] [138]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 139 undefined on input line 21518.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info.Free' on page 139 undefined on input line 21532. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 139 undefined on input line 21558. [139] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 140 undefined on input line 21704. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 140 undefined on input line 21767. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm' on page 140 undefined on input line 21780. [140] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Create_from_groups' on page 141 undefined on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Get_remote_group' on page 141 undefined on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Get_remote_size' on page 141 undefined on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Merge' on page 141 undefined on input line 21828. Package tabulary Warning: No suitable columns! on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Create_from_groups' on page 141 undefined on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Get_remote_group' on page 141 undefined on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Get_remote_size' on page 141 undefined on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.Merge' on page 141 undefined on input line 21828. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.remote_group' on page 141 undefined on input line 21855. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.remote_size' on page 141 undefined on input line 21855. Package tabulary Warning: No suitable columns! on input line 21855. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.remote_group' on page 141 undefined on input line 21855. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Inter comm.remote_size' on page 141 undefined on input line 21855. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 141 undefined on input line 21872. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 141 undefined on input line 21880. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 141 undefined on input line 21892. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 141 undefined on input line 21896. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 141 undefined on input line 21902. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 141 undefined on input line 21920. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 141 undefined on input line 21960. [141] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 142 undefined on input line 22007. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 142 undefined on input line 22020. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Accept' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Cart_map' on page 142 undefined on input line 22187. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Connect' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_cart' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_dist_graph' on page 142 undefined on input line 22187. Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|\T1/txtt/m/sl/10 Create_dist_graph\T1/qtm/m/n/10 (sources, de-grees, des-ti- na- LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_dist_graph_adjacent' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_from_group' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_graph' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_group' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_intercomm' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Exscan' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Exscan_init' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Graph_map' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Iexscan' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Iscan' on page 142 undefined on input line 22187. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Scan' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Scan_init' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Spawn' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Spawn_multiple' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.exscan' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.scan' on page 142 undefined on input line 22187. Package tabulary Warning: No suitable columns! on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Accept' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Cart_map' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Connect' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_cart' on page 142 undefined on input line 22187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_dist_graph' on page 142 undefined on input line 22187. Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|\T1/txtt/m/sl/10 Create_dist_graph\T1/qtm/m/n/10 (sources, de-grees, des-ti- na- LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm.Create_dist_graph_adjacent' on page 142 undefined on input line 22187. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Create_from_group' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Create_graph' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Create_group' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Create_intercomm' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Exscan' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Exscan_init' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Graph_map' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Iexscan' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Iscan' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Scan' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Scan_init' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Spawn' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.Spawn_multiple' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.exscan' on page 142 undefined on input line 22187.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm.scan' on page 142 undefined on input line 22187.
[142]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 143 undefined on input line 22208.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 143 undefined on input line 22218.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 143 undefined on input line 22270.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 143 undefined on input line 22280.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Cartcomm:mpi4py.MPI.Cartcomm' on page 143 undefined on input line 22313.
[143]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 144 undefined on input line 22348.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.Distgraphcomm' on page 144 undefined on input line 22358.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 144 undefined on input line 22393.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Distgraphcomm:mpi4py.MPI.Distgraphcomm' on page 144 undefined on input line 22403.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 144 undefined on input line 22422.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 144 undefined on input line 22430.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 144 undefined on input line 22434.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 144 undefined on input line 22440.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Graphcomm:mpi4py.MPI.Graphcomm' on page 144 undefined on input line 22473.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 144 undefined on input line 22492.
[144]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 145 undefined on input line 22502.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 145 undefined on input line 22525.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 145 undefined on input line 22539.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 145 undefined on input line 22558.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 145 undefined on input line 22558.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 145 undefined on input line 22562.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 145 undefined on input line 22566.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 145 undefined on input line 22591.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 145 undefined on input line 22591.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 145 undefined on input line 22595.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 145 undefined on input line 22599.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 145 undefined on input line 22603.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 145 undefined on input line 22609.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 145 undefined on input line 22657.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 145 undefined on input line 22657.
[145]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 146 undefined on input line 22661.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 146 undefined on input line 22665.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 146 undefined on input line 22671.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 146 undefined on input line 22690.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 146 undefined on input line 22690.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 146 undefined on input line 22694.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 146 undefined on input line 22698.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 146 undefined on input line 22704.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 146 undefined on input line 22723.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 146 undefined on input line 22723.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 146 undefined on input line 22727.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 146 undefined on input line 22731.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 146 undefined on input line 22756.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.InPlace' on page 146 undefined on input line 22756.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 146 undefined on input line 22760.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 146 undefined on input line 22764.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 146 undefined on input line 22768.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 146 undefined on input line 22774.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 146 undefined on input line 22805.
[146]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 147 undefined on input line 22819.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 147 undefined on input line 22850.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 147 undefined on input line 22850.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intercomm:mpi4py.MPI.Intercomm' on page 147 undefined on input line 22864.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 147 undefined on input line 22887.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 147 undefined on input line 22916.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 147 undefined on input line 22957.
[147]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Iprobe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Irecv' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Probe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Recv' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.f2py' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.free' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.fromhandle' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.iprobe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.irecv' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.probe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.py2f' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.recv' on page 148 undefined on input line 23061.
Package tabulary Warning: No suitable columns! on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Iprobe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Irecv' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Probe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.Recv' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.f2py' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.free' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.fromhandle' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.iprobe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.irecv' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.probe' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.py2f' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.recv' on page 148 undefined on input line 23061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.handle' on page 148 undefined on input line 23081.
Package tabulary Warning: No suitable columns! on input line 23081.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message.handle' on page 148 undefined on input line 23081.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 148 undefined on input line 23098.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 148 undefined on input line 23110.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 148 undefined on input line 23134.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 148 undefined on input line 23138.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 148 undefined on input line 23157.
[148]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 149 undefined on input line 23169.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 149 undefined on input line 23194.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 149 undefined on input line 23198.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 149 undefined on input line 23223.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message' on page 149 undefined on input line 23263.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 149 undefined on input line 23282.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 149 undefined on input line 23294.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 149 undefined on input line 23318.
[149]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 150 undefined on input line 23337.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 150 undefined on input line 23349.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 150 undefined on input line 23388.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 150 undefined on input line 23440.
[150]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Create' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Free' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Is_commutative' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Reduce_local' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.f2py' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.free' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Free' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.fromhandle' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.py2f' on page 151 undefined on input line 23516.
Package tabulary Warning: No suitable columns! on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Create' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Free' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Is_commutative' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Reduce_local' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.f2py' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.free' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Free' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.fromhandle' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.py2f' on page 151 undefined on input line 23516.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.handle' on page 151 undefined on input line 23550.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.is_commutative' on page 151 undefined on input line 23550.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.is_predefined' on page 151 undefined on input line 23550.
Package tabulary Warning: No suitable columns! on input line 23550.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.handle' on page 151 undefined on input line 23550.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.is_commutative' on page 151 undefined on input line 23550.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.is_predefined' on page 151 undefined on input line 23550.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 151 undefined on input line 23567.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 151 undefined on input line 23567.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 151 undefined on input line 23567.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 151 undefined on input line 23632.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 151 undefined on input line 23636.
[151]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 152 undefined on input line 23661.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op.Free' on page 152 undefined on input line 23675.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 152 undefined on input line 23701.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle' on page 152 undefined on input line 23788.
[152]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.dumps' on page 153 undefined on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.dumps_oob' on page 153 undefined on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.loads' on page 153 undefined on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.loads_oob' on page 153 undefined on input line 23836.
Package tabulary Warning: No suitable columns! on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.dumps' on page 153 undefined on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.dumps_oob' on page 153 undefined on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.loads' on page 153 undefined on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.loads_oob' on page 153 undefined on input line 23836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.PROTOCOL' on page 153 undefined on input line 23863.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.THRESHOLD' on page 153 undefined on input line 23863.
Package tabulary Warning: No suitable columns! on input line 23863.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.PROTOCOL' on page 153 undefined on input line 23863.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle.THRESHOLD' on page 153 undefined on input line 23863.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 153 undefined on input line 23905.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 153 undefined on input line 23923.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 153 undefined on input line 23946.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 153 undefined on input line 23950.
[153]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 154 undefined on input line 24003.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 154 undefined on input line 24016.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Parrived' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Pready' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Pready_list' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Pready_range' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Start' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Startall' on page 154 undefined on input line 24078.
Package tabulary Warning: No suitable columns! on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Parrived' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Pready' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Pready_list' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Pready_range' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Start' on page 154 undefined on input line 24078.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest.Startall' on page 154 undefined on input line 24078.
[154]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 155 undefined on input line 24207.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 155 undefined on input line 24246.
[155]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Cancel' on page 156 undefined on input line 24287.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Free' on page 156 undefined on input line 24294.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Get_status' on page 156 undefined on input line 24301.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Get_status_all' on page 156 undefined on input line 24308.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Get_status_any' on page 156 undefined on input line 24315.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Get_status_some' on page 156 undefined on input line 24322.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Test' on page 156 undefined on input line 24329.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Testall' on page 156 undefined on input line 24336.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Testany' on page 156 undefined on input line 24343.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Testsome' on page 156 undefined on input line 24350.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Wait' on page 156 undefined on input line 24357.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Waitall' on page 156 undefined on input line 24364.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Waitany' on page 156 undefined on input line 24371.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Waitsome' on page 156 undefined on input line 24378.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.cancel' on page 156 undefined on input line 24385.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.f2py' on page 156 undefined on input line 24392.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.free' on page 156 undefined on input line 24399.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Free' on page 156 undefined on input line 24402.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.fromhandle' on page 156 undefined on input line 24406.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.get_status' on page 156 undefined on input line 24413.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.get_status_all' on page 156 undefined on input line 24420.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.get_status_any' on page 156 undefined on input line 24427.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.get_status_some' on page 156 undefined on input line 24434.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.py2f' on page 156 undefined on input line 24441.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.test' on page 156 undefined on input line 24448.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.testall' on page 156 undefined on input line 24455.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.testany' on page 156 undefined on input line 24462.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.testsome' on page 156 undefined on input line 24469.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.wait' on page 156 undefined on input line 24476.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.waitall' on page 156 undefined on input line 24483.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.waitany' on page 156 undefined on input line 24490.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.waitsome' on page 156 undefined on input line 24497.
Package longtable Warning: Column widths have changed
(longtable)              in table 5 on input line 24503.
[156]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.handle' on page 157 undefined on input line 24524.
Package tabulary Warning: No suitable columns! on input line 24524.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.handle' on page 157 undefined on input line 24524.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 157 undefined on input line 24576.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 157 undefined on input line 24599.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 157 undefined on input line 24603.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 157 undefined on input line 24628.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 157 undefined on input line 24632.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 157 undefined on input line 24657.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 157 undefined on input line 24661.
[157]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 158 undefined on input line 24685.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 158 undefined on input line 24708.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 158 undefined on input line 24712.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 158 undefined on input line 24737.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 158 undefined on input line 24741.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 158 undefined on input line 24766.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 158 undefined on input line 24770.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 158 undefined on input line 24794.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 158 undefined on input line 24817.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 158 undefined on input line 24821.
[158]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 159 undefined on input line 24846.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 159 undefined on input line 24850.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 159 undefined on input line 24875.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 159 undefined on input line 24879.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 159 undefined on input line 24922.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request.Free' on page 159 undefined on input line 24936.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 159 undefined on input line 24962.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 159 undefined on input line 24980.
[159]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 160 undefined on input line 25003.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 160 undefined on input line 25007.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 160 undefined on input line 25032.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 160 undefined on input line 25036.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 160 undefined on input line 25061.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 160 undefined on input line 25065.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 160 undefined on input line 25104.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 160 undefined on input line 25127.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 160 undefined on input line 25131.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 160 undefined on input line 25156.
[160]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 161 undefined on input line 25160.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 161 undefined on input line 25185.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 161 undefined on input line 25189.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 161 undefined on input line 25213.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 161 undefined on input line 25236.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 161 undefined on input line 25240.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 161 undefined on input line 25265.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 161 undefined on input line 25269.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 161 undefined on input line 25294.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 161 undefined on input line 25298.
[161]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session' on page 162 undefined on input line 25352.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Attach_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Call_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Create_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Create_group' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Detach_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Finalize' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Flush_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_info' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_nth_pset' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_num_psets' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_pset_info' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Iflush_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Init' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Set_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.f2py' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.free' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Finalize' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.fromhandle' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.py2f' on page 162 undefined on input line 25505.
Package tabulary Warning: No suitable columns! on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Attach_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Call_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Create_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Create_group' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Detach_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Finalize' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Flush_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_info' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_nth_pset' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_num_psets' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Get_pset_info' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Iflush_buffer' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Init' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Set_errhandler' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.f2py' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.free' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.Finalize' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.fromhandle' on page 162 undefined on input line 25505.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.py2f' on page 162 undefined on input line 25505.
[162]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.handle' on page 163 undefined on input line 25525.
Package tabulary Warning: No suitable columns! on input line 25525.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session.handle' on page 163 undefined on input line 25525.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 163 undefined on input line 25541.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session' on page 163 undefined on input line 25585.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 163 undefined on input line 25589.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 163 undefined on input line 25611.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 163 undefined on input line 25629.
[163]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 164 undefined on input line 25683.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 164 undefined on input line 25701.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 164 undefined on input line 25724.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 164 undefined on input line 25748.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 164 undefined on input line 25774.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 164 undefined on input line 25792.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 164 undefined on input line 25811. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 164 undefined on input line 25815. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 164 undefined on input line 25839. [164] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session ' on page 165 undefined on input line 25862. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session .Finalize' on page 165 undefined on input line 25876. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session ' on page 165 undefined on input line 25902. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status' on page 165 undefined on input line 25965. [165] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_count' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_elements' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_error' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_source' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_tag' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.I s_cancelled' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_cancelled' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_elements' on page 166 undefined on input line 26076. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_error' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_source' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_tag' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.f 2py' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.p y2f' on page 166 undefined on input line 26076. Package tabulary Warning: No suitable columns! on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_count' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_elements' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_error' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_source' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.G et_tag' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.I s_cancelled' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_cancelled' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_elements' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_error' on page 166 undefined on input line 26076. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_source' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.S et_tag' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.f 2py' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.p y2f' on page 166 undefined on input line 26076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.c ancelled' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.c ount' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.e rror' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.s ource' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.t ag' on page 166 undefined on input line 26124. Package tabulary Warning: No suitable columns! on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.c ancelled' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.c ount' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.e rror' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.s ource' on page 166 undefined on input line 26124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Status:mpi4py.MPI.Status.t ag' on page 166 undefined on input line 26124. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 166 undefined on input line 26140.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 166 undefined on input line 26162.
[166]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 167 undefined on input line 26291.
[167]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 168 undefined on input line 26484.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm' on page 168 undefined on input line 26497.
[168]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_allgather' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_allgatherv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_alltoall' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_alltoallv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_alltoallw' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgather' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgather_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_allgather_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgatherv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgatherv_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_allgatherv_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoall' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoall_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_alltoall_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallv_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_alltoallv_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallw' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallw_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_alltoallw_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.neighbor_allgather' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.neighbor_alltoall' on page 169 undefined on input line 26636.
Package tabulary Warning: No suitable columns! on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_allgather' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_allgatherv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_alltoall' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_alltoallv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Ineighbor_alltoallw' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgather' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgather_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_allgather_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgatherv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_allgatherv_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_allgatherv_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoall' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoall_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_alltoall_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallv' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallv_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_alltoallv_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallw' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.Neighbor_alltoallw_init' on page 169 undefined on input line 26636.
Underfull \hbox (badness 10000) in paragraph at lines 26636--26636
[]|\T1/txtt/m/sl/10 Neighbor_alltoallw_init\T1/qtm/m/n/10 (sendbuf, recvbuf[,
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.neighbor_allgather' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.neighbor_alltoall' on page 169 undefined on input line 26636.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.degrees' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.indegree' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.inedges' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.inoutedges' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.outdegree' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.outedges' on page 169 undefined on input line 26691.
Package tabulary Warning: No suitable columns! on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.degrees' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.indegree' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.inedges' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.inoutedges' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.outdegree' on page 169 undefined on input line 26691.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Topocomm:mpi4py.MPI.Topocomm.outedges' on page 169 undefined on input line 26691.
[169]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 170 undefined on input line 26708.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 170 undefined on input line 26712.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 170 undefined on input line 26718.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 170 undefined on input line 26737.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 170 undefined on input line 26741.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 170 undefined on input line 26747.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 170 undefined on input line 26766.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 170 undefined on input line 26770.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 170 undefined on input line 26776.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 170 undefined on input line 26795.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 170 undefined on input line 26799.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 170 undefined on input line 26805.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 170 undefined on input line 26824.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 170 undefined on input line 26828.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 170 undefined on input line 26834.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 170 undefined on input line 26853.
[170]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 171 undefined on input line 26857.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 171 undefined on input line 26882.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 171 undefined on input line 26886.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 171 undefined on input line 26890.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 171 undefined on input line 26896.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 171 undefined on input line 26915.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 171 undefined on input line 26919.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 171 undefined on input line 26944.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 171 undefined on input line 26948.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 171 undefined on input line 26952.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 171 undefined on input line 26958.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 171 undefined on input line 26977.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 171 undefined on input line 26981.
[171]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 172 undefined on input line 27006.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecB' on page 172 undefined on input line 27010.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 172 undefined on input line 27014.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 172 undefined on input line 27020.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 172 undefined on input line 27039.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 172 undefined on input line 27043.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 172 undefined on input line 27068.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecV' on page 172 undefined on input line 27072.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 172 undefined on input line 27076.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 172 undefined on input line 27082.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 172 undefined on input line 27101.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 172 undefined on input line 27105.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 172 undefined on input line 27130.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpecW' on page 172 undefined on input line 27134.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 172 undefined on input line 27138.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Prequest:mpi4py.MPI.Prequest' on page 172 undefined on input line 27144.
[172]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 173 undefined on input line 27296.
[173]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Accumulate' on page 174 undefined on input line 27337.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Allocate' on page 174 undefined on input line 27344.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Allocate_shared' on page 174 undefined on input line 27351.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Attach' on page 174 undefined on input line 27358.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Call_errhandler' on page 174 undefined on input line 27365.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Compare_and_swap' on page 174 undefined on input line 27372.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Complete' on page 174 undefined on input line 27379.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Start' on page 174 undefined on input line 27382.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Create' on page 174 undefined on input line 27386.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Create_dynamic' on page 174 undefined on input line 27393.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Create_errhandler' on page 174 undefined on input line 27400.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Create_ keyval' on page 174 undefined on input line 27407. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Delete_ attr' on page 174 undefined on input line 27414. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Detach' on page 174 undefined on input line 27421. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Fence' on page 174 undefined on input line 27428. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Fetch_a nd_op' on page 174 undefined on input line 27435. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Flush' on page 174 undefined on input line 27442. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Flush_a ll' on page 174 undefined on input line 27449. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Flush_l ocal' on page 174 undefined on input line 27456. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Flush_l ocal_all' on page 174 undefined on input line 27463. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Free' o n page 174 undefined on input line 27470. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Free_ke yval' on page 174 undefined on input line 27477. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get' on page 174 undefined on input line 27484. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get_acc umulate' on page 174 undefined on input line 27491. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get_att r' on page 174 undefined on input line 27498. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get_err handler' on page 174 undefined on input line 27505. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get_gro up' on page 174 undefined on input line 27512. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get_inf o' on page 174 undefined on input line 27519. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Get_nam e' on page 174 undefined on input line 27526. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Lock' o n page 174 undefined on input line 27533. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Lock_al l' on page 174 undefined on input line 27540. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Post' o n page 174 undefined on input line 27547. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Put' on page 174 undefined on input line 27554. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Raccumu late' on page 174 undefined on input line 27561. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Rget' o n page 174 undefined on input line 27568. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Rget_ac cumulate' on page 174 undefined on input line 27575. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Rput' o n page 174 undefined on input line 27582. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Set_att r' on page 174 undefined on input line 27589. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Set_err handler' on page 174 undefined on input line 27596. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Set_inf o' on page 174 undefined on input line 27603. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Set_nam e' on page 174 undefined on input line 27610. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Shared_query' on page 174 undefined on input line 27617.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Start' on page 174 undefined on input line 27624.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Sync' on page 174 undefined on input line 27631.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Test' on page 174 undefined on input line 27638.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Unlock' on page 174 undefined on input line 27645.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Unlock_all' on page 174 undefined on input line 27652.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Wait' on page 174 undefined on input line 27659.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Post' on page 174 undefined on input line 27662.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.f2py' on page 174 undefined on input line 27666.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.free' on page 174 undefined on input line 27673.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Free' on page 174 undefined on input line 27676.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.fromhandle' on page 174 undefined on input line 27680.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.py2f' on page 174 undefined on input line 27687.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.tomemory' on page 174 undefined on input line 27694.
Package longtable Warning: Column widths have changed (longtable) in table 6 on input line 27700.
[174]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.attrs' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.flavor' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.group' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.group_rank' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.group_size' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.handle' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.info' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.model' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.name' on page 175 undefined on input line 27777.
Package tabulary Warning: No suitable columns! on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.attrs' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.flavor' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.group' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.group_rank' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.group_size' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.handle' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.info' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.model' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.name' on page 175 undefined on input line 27777.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 175 undefined on input line 27794.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 175 undefined on input line 27802.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 175 undefined on input line 27806.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 175 undefined on input line 27839.
[175]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 176 undefined on input line 27843.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 176 undefined on input line 27876.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 176 undefined on input line 27880.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 176 undefined on input line 27904.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 176 undefined on input line 27949.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 176 undefined on input line 27953.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 176 undefined on input line 27957.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Start' on page 176 undefined on input line 27985.
[176]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 177 undefined on input line 28008.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Bottom' on page 177 undefined on input line 28008.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 177 undefined on input line 28016.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 177 undefined on input line 28020.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 177 undefined on input line 28045.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 177 undefined on input line 28049.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 177 undefined on input line 28073.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 177 undefined on input line 28077.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 177 undefined on input line 28096.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 177 undefined on input line 28100.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 177 undefined on input line 28150.
[177]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 178 undefined on input line 28195.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 178 undefined on input line 28199.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 178 undefined on input line 28211.
[178]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 179 undefined on input line 28356.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 179 undefined on input line 28364.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 179 undefined on input line 28389.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 179 undefined on input line 28393.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 179 undefined on input line 28401.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 179 undefined on input line 28405.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 179 undefined on input line 28451.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 179 undefined on input line 28469.
[179]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 180 undefined on input line 28487.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 180 undefined on input line 28579.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 180 undefined on input line 28608.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 180 undefined on input line 28616.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 180 undefined on input line 28641.
[180]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 181 undefined on input line 28649.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 181 undefined on input line 28653.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 181 undefined on input line 28659.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 181 undefined on input line 28678.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 181 undefined on input line 28686.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 181 undefined on input line 28692.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 181 undefined on input line 28711.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 181 undefined on input line 28715.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 181 undefined on input line 28723.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 181 undefined on input line 28727.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 181 undefined on input line 28733.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.BufSpec' on page 181 undefined on input line 28752.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.TargetSpec' on page 181 undefined on input line 28760.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 181 undefined on input line 28766.
[181]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 182 undefined on input line 28813.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 182 undefined on input line 28835.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 182 undefined on input line 28883.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 182 undefined on input line 28902.
[182]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Post' on page 183 undefined on input line 29002.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 183 undefined on input line 29025.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win.Free' on page 183 undefined on input line 29039.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 183 undefined on input line 29065.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 183 undefined on input line 29098.
[183]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 184 undefined on input line 29242.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.allocate' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.cast' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.fromaddress' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.frombuffer' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.release' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.tobytes' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.toreadonly' on page 184 undefined on input line 29311.
Package tabulary Warning: No suitable columns! on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.allocate' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.cast' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.fromaddress' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.frombuffer' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.release' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.tobytes' on page 184 undefined on input line 29311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.toreadonly' on page 184 undefined on input line 29311.
[184]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.address' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.format' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.itemsize' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.nbytes' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.obj' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.readonly' on page 185 undefined on input line 29366.
Package tabulary Warning: No suitable columns! on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.address' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.format' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.itemsize' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.nbytes' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.obj' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer.readonly' on page 185 undefined on input line 29366.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 185 undefined on input line 29393.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 185 undefined on input line 29455.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 185 undefined on input line 29474.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 185 undefined on input line 29484.
[185]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 186 undefined on input line 29542.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 186 undefined on input line 29637.
[186]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception' on page 187 undefined on input line 29658.
Package tabulary Warning: No suitable columns! on input line 29658.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception' on page 187 undefined on input line 29658.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.Get_error_class' on page 187 undefined on input line 29728.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.Get_error_code' on page 187 undefined on input line 29728.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.Get_error_string' on page 187 undefined on input line 29728.
Package tabulary Warning: No suitable columns! on input line 29728.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.Get_error_class' on page 187 undefined on input line 29728.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.Get_error_code' on page 187 undefined on input line 29728.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.Get_error_string' on page 187 undefined on input line 29728.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.error_class' on page 187 undefined on input line 29762.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.error_code' on page 187 undefined on input line 29762.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.error_string' on page 187 undefined on input line 29762.
Package tabulary Warning: No suitable columns! on input line 29762.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.error_class' on page 187 undefined on input line 29762.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.error_code' on page 187 undefined on input line 29762.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Exception:mpi4py.MPI.Exception.error_string' on page 187 undefined on input line 29762.
[187]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Add_error_class:mpi4py.MPI.Add_error_class' on page 188 undefined on input line 29891.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Add_error_code:mpi4py.MPI.Add_error_code' on page 188 undefined on input line 29898.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Add_error_string:mpi4py.MPI.Add_error_string' on page 188 undefined on input line 29905.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Aint_add:mpi4py.MPI.Aint_add' on page 188 undefined on input line 29912.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Aint_diff:mpi4py.MPI.Aint_diff' on page 188 undefined on input line 29919.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Alloc_mem:mpi4py.MPI.Alloc_mem' on page 188 undefined on input line 29926.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Attach_buffer:mpi4py.MPI.Attach_buffer' on page 188 undefined on input line 29933.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Close_port:mpi4py.MPI.Close_port' on page 188 undefined on input line 29940.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Compute_dims:mpi4py.MPI.Compute_dims' on page 188 undefined on input line 29947.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Detach_buffer:mpi4py.MPI.Detach_buffer' on page 188 undefined on input line 29954.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Finalize:mpi4py.MPI.Finalize' on page 188 undefined on input line 29961.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Flush_buffer:mpi4py.MPI.Flush_buffer' on page 188 undefined on input line 29968.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Free_mem:mpi4py.MPI.Free_mem' on page 188 undefined on input line 29975.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Alloc_mem:mpi4py.MPI.Alloc_mem' on page 188 undefined on input line 29978.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_address:mpi4py.MPI.Get_address' on page 188 undefined on input line 29982.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_error_class:mpi4py.MPI.Get_error_class' on page 188 undefined on input line 29989.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_error_string:mpi4py.MPI.Get_error_string' on page 188 undefined on input line 29996.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_hw_resource_info:mpi4py.MPI.Get_hw_resource_info' on page 188 undefined on input line 30003.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_library_version:mpi4py.MPI.Get_library_version' on page 188 undefined on input line 30010.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_processor_name:mpi4py.MPI.Get_processor_name' on page 188 undefined on input line 30017.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Get_version:mpi4py.MPI.Get_version' on page 188 undefined on input line 30024.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Iflush_buffer:mpi4py.MPI.Iflush_buffer' on page 188 undefined on input line 30031.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 188 undefined on input line 30038.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init_thread:mpi4py.MPI.Init_thread' on page 188 undefined on input line 30045.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_finalized:mpi4py.MPI.Is_finalized' on page 188 undefined on input line 30052.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Finalize:mpi4py.MPI.Finalize' on page 188 undefined on input line 30055.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_initialized:mpi4py.MPI.Is_initialized' on page 188 undefined on input line 30059.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 188 undefined on input line 30062.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Is_thread_main:mpi4py.MPI.Is_thread_main' on page 188 undefined on input line 30066.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 188 undefined on input line 30069.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init_thread:mpi4py.MPI.Init_thread' on page 188 undefined on input line 30069.
Underfull \hbox (badness 10000) in paragraph at lines 30068--30071
[]|\T1/qtm/m/n/10 In-di-cate whether this thread called \T1/txtt/m/sl/10 Init \T1/qtm/m/n/10 or
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Lookup_name:mpi4py.MPI.Lookup_name' on page 188 undefined on input line 30073.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Open_port:mpi4py.MPI.Open_port' on page 188 undefined on input line 30080.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pcontrol:mpi4py.MPI.Pcontrol' on page 188 undefined on input line 30087.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Publish_name:mpi4py.MPI.Publish_name' on page 188 undefined on input line 30094.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Query_thread:mpi4py.MPI.Query_thread' on page 188 undefined on input line 30101.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Register_datarep:mpi4py.MPI.Register_datarep' on page 188 undefined on input line 30108.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Remove_error_class:mpi4py.MPI.Remove_error_class' on page 188 undefined on input line 30115.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Remove_error_code:mpi4py.MPI.Remove_error_code' on page 188 undefined on input line 30122.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Remove_error_string:mpi4py.MPI.Remove_error_string' on page 188 undefined on input line 30129.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Unpublish_name:mpi4py.MPI.Unpublish_name' on page 188 undefined on input line 30136.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtick:mpi4py.MPI.Wtick' on page 188 undefined on input line 30143.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtime:mpi4py.MPI.Wtime' on page 188 undefined on input line 30146.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtime:mpi4py.MPI.Wtime' on page 188 undefined on input line 30150.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.get_vendor:mpi4py.MPI.get_vendor' on page 188 undefined on input line 30157.
Package longtable Warning: Column widths have changed (longtable) in table 7 on input line 30163.
[188] [189]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 190 undefined on input line 30341.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 190 undefined on input line 30347.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 190 undefined on input line 30370.
[190]
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 191 undefined on input line 30458.
[191]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Alloc_mem:mpi4py.MPI.Alloc_mem' on page 192 undefined on input line 30523.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.buffer:mpi4py.MPI.buffer' on page 192 undefined on input line 30527.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 192 undefined on input line 30554.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Bottom' on page 192 undefined on input line 30554.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 192 undefined on input line 30635.
[192]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request' on page 193 undefined on input line 30727.
[193]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Finalize:mpi4py.MPI.Finalize' on page 194 undefined on input line 30796.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 194 undefined on input line 30819.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init:mpi4py.MPI.Init' on page 194 undefined on input line 30842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Init_thread:mpi4py.MPI.Init_thread' on page 194 undefined on input line 30842.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 194 undefined on input line 30874.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 194 undefined on input line 30903.
[194]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 195 undefined on input line 30966.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 195 undefined on input line 31023.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 195 undefined on input line 31023.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 195 undefined on input line 31023.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 195 undefined on input line 31027.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 195 undefined on input line 31027.
LaTeX Warning: Hyper reference `mpi4py.typing:mpi4py.typing.Buffer' on page 195 undefined on input line 31027.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 195 undefined on input line 31031.
[195]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 196 undefined on input line 31150.
[196]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Wtime:mpi4py.MPI.Wtime' on page 197 undefined on input line 31175.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNDEFINED:mpi4py.MPI.UNDEFINED' on page 197 undefined on input line 31276.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ANY_SOURCE:mpi4py.MPI.ANY_SOURCE' on page 197 undefined on input line 31283.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ANY_TAG:mpi4py.MPI.ANY_TAG' on page 197 undefined on input line 31290.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.PROC_NULL:mpi4py.MPI.PROC_NULL' on page 197 undefined on input line 31297.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ROOT:mpi4py.MPI.ROOT' on page 197 undefined on input line 31304.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOTTOM:mpi4py.MPI.BOTTOM' on page 197 undefined on input line 31311.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.BottomType' on page 197 undefined on input line 31314.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IN_PLACE:mpi4py.MPI.IN_PLACE' on page 197 undefined on input line 31318.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.InPlaceType:mpi4py.MPI.InPlaceType' on page 197 undefined on input line 31321.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.KEYVAL_INVALID:mpi4py.MPI.KEYVAL_INVALID' on page 197 undefined on input line 31325.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TAG_UB:mpi4py.MPI.TAG_UB' on page 197 undefined on input line 31332.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IO:mpi4py.MPI.IO' on page 197 undefined on input line 31339.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WTIME_IS_GLOBAL:mpi4py.MPI.WTIME_IS_GLOBAL' on page 197 undefined on input line 31346.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNIVERSE_SIZE:mpi4py.MPI.UNIVERSE_SIZE' on page 197 undefined on input line 31353.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.APPNUM:mpi4py.MPI.APPNUM' on page 197 undefined on input line 31360.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LASTUSEDCODE:mpi4py.MPI.LASTUSEDCODE' on page 197 undefined on input line 31367.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_BASE:mpi4py.MPI.WIN_BASE' on page 197 undefined on input line 31374.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_SIZE:mpi4py.MPI.WIN_SIZE' on page 197 undefined on input line 31381.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_DISP_UNIT:mpi4py.MPI.WIN_DISP_UNIT' on page 197 undefined on input line 31388.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_CREATE_FLAVOR:mpi4py.MPI.WIN_CREATE_FLAVOR' on page 197 undefined on input line 31395.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR:mpi4py.MPI.WIN_FLAVOR' on page 197 undefined on input line 31402.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_MODEL:mpi4py.MPI.WIN_MODEL' on page 197 undefined on input line 31409.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SUCCESS:mpi4py.MPI.SUCCESS' on page 197 undefined on input line 31416.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_LASTCODE:mpi4py.MPI.ERR_LASTCODE' on page 197 undefined on input line 31423.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TYPE:mpi4py.MPI.ERR_TYPE' on page 197 undefined on input line 31430.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_REQUEST:mpi4py.MPI.ERR_REQUEST' on page 197 undefined on input line 31437.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_OP:mpi4py.MPI.ERR_OP' on page 197 undefined on input line 31444.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_GROUP:mpi4py.MPI.ERR_GROUP' on page 197 undefined on input line 31451.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO:mpi4py.MPI.ERR_INFO' on page 197 undefined on input line 31458.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ERRHANDLER:mpi4py.MPI.ERR_ERRHANDLER' on page 197 undefined on input line 31465.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_SESSION:mpi4py.MPI.ERR_SESSION' on page 197 undefined on input line 31472.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_COMM:mpi4py.MPI.ERR_COMM' on page 197 undefined on input line 31479.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_WIN:mpi4py.MPI.ERR_WIN' on page 197 undefined on input line 31486.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_FILE:mpi4py.MPI.ERR_FILE' on page 197 undefined on input line 31493.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_BUFFER:mpi4py.MPI.ERR_BUFFER' on page 197 undefined on input line 31500.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_COUNT:mpi4py.MPI.ERR_COUNT' on page 197 undefined on input line 31507.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TAG:mpi4py.MPI.ERR_TAG' on page 197 undefined on input line 31514.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RANK:mpi4py.MPI.ERR_RANK' on page 197 undefined on input line 31521.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ROOT:mpi4py.MPI.ERR_ROOT' on page 197 undefined on input line 31528.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TRUNCATE:mpi4py.MPI.ERR_TRUNCATE' on page 197 undefined on input line 31535.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_IN_STATUS:mpi4py.MPI.ERR_IN_STATUS' on page 197 undefined on input line 31542.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_PENDING:mpi4py.MPI.ERR_PENDING' on page 197 undefined on input line 31549.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_TOPOLOGY:mpi4py.MPI.ERR_TOPOLOGY' on page 197 undefined on input line 31556.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_DIMS:mpi4py.MPI.ERR_DIMS' on page 197 undefined on input line 31563.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ARG:mpi4py.MPI.ERR_ARG' on page 197 undefined on input line 31570.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_OTHER:mpi4py.MPI.ERR_OTHER' on page 197 undefined on input line 31577.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_UNKNOWN:mpi4py.MPI.ERR_UNKNOWN' on page 197 undefined on input line 31584.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INTERN:mpi4py.MPI.ERR_INTERN' on page 197 undefined on input line 31591.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_KEYVAL:mpi4py.MPI.ERR_KEYVAL' on page 197 undefined on input line 31598.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NO_MEM:mpi4py.MPI.ERR_NO_MEM' on page 197 undefined on input line 31605.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO_KEY:mpi4py.MPI.ERR_INFO_KEY' on page 197 undefined on input line 31612.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO_VALUE:mpi4py.MPI.ERR_INFO_VALUE' on page 197 undefined on input line 31619.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_INFO_NOKEY:mpi4py.MPI.ERR_INFO_NOKEY' on page 197 undefined on input line 31626.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_SPAWN:mpi4py.MPI.ERR_SPAWN' on page 197 undefined on input line 31633.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_PORT:mpi4py.MPI.ERR_PORT' on page 197 undefined on input line 31640.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_SERVICE:mpi4py.MPI.ERR_SERVICE' on page 197 undefined on input line 31647.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NAME:mpi4py.MPI.ERR_NAME' on page 197 undefined on input line 31654.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_PROC_ABORTED:mpi4py.MPI.ERR_PROC_ABORTED' on page 197 undefined on input line 31661.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_BASE:mpi4py.MPI.ERR_BASE' on page 197 undefined on input line 31668.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_SIZE:mpi4py.MPI.ERR_SIZE' on page 197 undefined on input line 31675.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_DISP:mpi4py.MPI.ERR_DISP' on page 197 undefined on input line 31682.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ASSERT:mpi4py.MPI.ERR_ASSERT' on page 197 undefined on input line 31689.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_LOCKTYPE:mpi4py.MPI.ERR_LOCKTYPE' on page 197 undefined on input line 31696.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_CONFLICT:mpi4py.MPI.ERR_RMA_CONFLICT' on page 197 undefined on input line 31703.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_SYNC:mpi4py.MPI.ERR_RMA_SYNC' on page 197 undefined on input line 31710.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_RANGE:mpi4py.MPI.ERR_RMA_RANGE' on page 197 undefined on input line 31717.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_ATTACH:mpi4py.MPI.ERR_RMA_ATTACH' on page 197 undefined on input line 31724.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_SHARED:mpi4py.MPI.ERR_RMA_SHARED' on page 197 undefined on input line 31731.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_RMA_FLAVOR:mpi4py.MPI.ERR_RMA_FLAVOR' on page 197 undefined on input line 31738.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_BAD_FILE:mpi4py.MPI.ERR_BAD_FILE' on page 197 undefined on input line 31745.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NO_SUCH_FILE:mpi4py.MPI.ERR_NO_SUCH_FILE' on page 197 undefined on input line 31752.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_FILE_EXISTS:mpi4py.MPI.ERR_FILE_EXISTS' on page 197 undefined on input line 31759.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_FILE_IN_USE:mpi4py.MPI.ERR_FILE_IN_USE' on page 197 undefined on input line 31766.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_AMODE:mpi4py.MPI.ERR_AMODE' on page 197 undefined on input line 31773.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_ACCESS:mpi4py.MPI.ERR_ACCESS' on page 197 undefined on input line 31780.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_READ_ONLY:mpi4py.MPI.ERR_READ_ONLY' on page 197 undefined on input line 31787.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NO_SPACE:mpi4py.MPI.ERR_NO_SPACE' on page 197 undefined on input line 31794.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_QUOTA:mpi4py.MPI.ERR_QUOTA' on page 197 undefined on input line 31801.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_NOT_SAME:mpi4py.MPI.ERR_NOT_SAME' on page 197 undefined on input line 31808.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_IO:mpi4py.MPI.ERR_IO' on page 197 undefined on input line 31815.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_UNSUPPORTED_OPERATION:mpi4py.MPI.ERR_UNSUPPORTED_OPERATION' on page 197 undefined on input line 31822.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_UNSUPPORTED_DATAREP:mpi4py.MPI.ERR_UNSUPPORTED_DATAREP' on page 197 undefined on input line 31829.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_CONVERSION:mpi4py.MPI.ERR_CONVERSION' on page 197 undefined on input line 31836.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_DUP_DATAREP:mpi4py.MPI.ERR_DUP_DATAREP' on page 197 undefined on input line 31843.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_VALUE_TOO_LARGE:mpi4py.MPI.ERR_VALUE_TOO_LARGE' on page 197 undefined on input line 31850.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_REVOKED:mpi4py.MPI.ERR_REVOKED' on page 197 undefined on input line 31857.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_PROC_FAILED:mpi4py.MPI.ERR_PROC_FAILED' on page 197 undefined on input line 31864.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERR_PROC_FAILED_PENDING:mpi4py.MPI.ERR_PROC_FAILED_PENDING' on page 197 undefined on input line 31871.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ORDER_C:mpi4py.MPI.ORDER_C' on page 197 undefined on input line 31878.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ORDER_FORTRAN:mpi4py.MPI.ORDER_FORTRAN' on page 197 undefined on input line 31885.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ORDER_F:mpi4py.MPI.ORDER_F' on page 197 undefined on input line 31892.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TYPECLASS_INTEGER:mpi4py.MPI.TYPECLASS_INTEGER' on page 197 undefined on input line 31899.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TYPECLASS_REAL:mpi4py.MPI.TYPECLASS_REAL' on page 197 undefined on input line 31906.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TYPECLASS_COMPLEX:mpi4py.MPI.TYPECLASS_COMPLEX' on page 197 undefined on input line 31913.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_NONE:mpi4py.MPI.DISTRIBUTE_NONE' on page 197 undefined on input line 31920.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_BLOCK:mpi4py.MPI.DISTRIBUTE_BLOCK' on page 197 undefined on input line 31927.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_CYCLIC:mpi4py.MPI.DISTRIBUTE_CYCLIC' on page 197 undefined on input line 31934.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISTRIBUTE_DFLT_DARG:mpi4py.MPI.DISTRIBUTE_DFLT_DARG' on page 197 undefined on input line 31941.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_NAMED:mpi4py.MPI.COMBINER_NAMED' on page 197 undefined on input line 31948.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_DUP:mpi4py.MPI.COMBINER_DUP' on page 197 undefined on input line 31955.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_CONTIGUOUS:mpi4py.MPI.COMBINER_CONTIGUOUS' on page 197 undefined on input line 31962.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_VECTOR:mpi4py.MPI.COMBINER_VECTOR' on page 197 undefined on input line 31969.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_HVECTOR:mpi4py.MPI.COMBINER_HVECTOR' on page 197 undefined on input line 31976.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_INDEXED:mpi4py.MPI.COMBINER_INDEXED' on page 197 undefined on input line 31983.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_HINDEXED:mpi4py.MPI.COMBINER_HINDEXED' on page 197 undefined on input line 31990.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_INDEXED_BLOCK:mpi4py.MPI.COMBINER_INDEXED_BLOCK' on page 197 undefined on input line 31997.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_HINDEXED_BLOCK:mpi4py.MPI.COMBINER_HINDEXED_BLOCK' on page 197 undefined on input line 32004.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_STRUCT:mpi4py.MPI.COMBINER_STRUCT' on page 197 undefined on input line 32011.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_SUBARRAY:mpi4py.MPI.COMBINER_SUBARRAY' on page 197 undefined on input line 32018.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_DARRAY:mpi4py.MPI.COMBINER_DARRAY' on page 197 undefined on input line 32025.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_RESIZED:mpi4py.MPI.COMBINER_RESIZED' on page 197 undefined on input line 32032.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_VALUE_INDEX:mpi4py.MPI.COMBINER_VALUE_INDEX' on page 197 undefined on input line 32039.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_F90_INTEGER:mpi4py.MPI.COMBINER_F90_INTEGER' on page 197 undefined on input line 32046.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_F90_REAL:mpi4py.MPI.COMBINER_F90_REAL' on page 197 undefined on input line 32053.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMBINER_F90_COMPLEX:mpi4py.MPI.COMBINER_F90_COMPLEX' on page 197 undefined on input line 32060.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_SOURCE:mpi4py.MPI.F_SOURCE' on page 197 undefined on input line 32067.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_TAG:mpi4py.MPI.F_TAG' on page 197 undefined on input line 32074.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_ERROR:mpi4py.MPI.F_ERROR' on page 197 undefined on input line 32081.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_STATUS_SIZE:mpi4py.MPI.F_STATUS_SIZE' on page 197 undefined on input line 32088.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.IDENT:mpi4py.MPI.IDENT' on page 197 undefined on input line 32095.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CONGRUENT:mpi4py.MPI.CONGRUENT' on page 197 undefined on input line 32102.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIMILAR:mpi4py.MPI.SIMILAR' on page 197 undefined on input line 32109.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNEQUAL:mpi4py.MPI.UNEQUAL' on page 197 undefined on input line 32116.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CART:mpi4py.MPI.CART' on page 197 undefined on input line 32123.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.GRAPH:mpi4py.MPI.GRAPH' on page 197 undefined on input line 32130.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DIST_GRAPH:mpi4py.MPI.DIST_GRAPH' on page 197 undefined on input line 32137.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNWEIGHTED:mpi4py.MPI.UNWEIGHTED' on page 197 undefined on input line 32144.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WEIGHTS_EMPTY:mpi4py.MPI.WEIGHTS_EMPTY' on page 197 undefined on input line 32151.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_TYPE_SHARED:mpi4py.MPI.COMM_TYPE_SHARED' on page 197 undefined on input line 32158.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_TYPE_HW_GUIDED:mpi4py.MPI.COMM_TYPE_HW_GUIDED' on page 197 undefined on input line 32165.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_TYPE_HW_UNGUIDED:mpi4py.MPI.COMM_TYPE_HW_UNGUIDED' on page 197 undefined on input line 32172.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_TYPE_RESOURCE_GUIDED:mpi4py.MPI.COMM_TYPE_RESOURCE_GUIDED' on page 197 undefined on input line 32179.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BSEND_OVERHEAD:mpi4py.MPI.BSEND_OVERHEAD' on page 197 undefined on input line 32186.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BUFFER_AUTOMATIC:mpi4py.MPI.BUFFER_AUTOMATIC' on page 197 undefined on input line 32193.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BufferAutomaticType:mpi4py.MPI.BufferAutomaticType' on page 197 undefined on input line 32196.
Underfull \hbox (badness 10000) in paragraph at lines 32195--32198
[]|\T1/qtm/m/n/10 Con-stant \T1/txtt/m/n/10 BUFFER_AUTOMATIC \T1/qtm/m/n/10 of type
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_CREATE:mpi4py.MPI.WIN_FLAVOR_CREATE' on page 197 undefined on input line 32200.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_ALLOCATE:mpi4py.MPI.WIN_FLAVOR_ALLOCATE' on page 197 undefined on input line 32207.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_DYNAMIC:mpi4py.MPI.WIN_FLAVOR_DYNAMIC' on page 197 undefined on input line 32214.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_FLAVOR_SHARED:mpi4py.MPI.WIN_FLAVOR_SHARED' on page 197 undefined on input line 32221.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_SEPARATE:mpi4py.MPI.WIN_SEPARATE' on page 197 undefined on input line 32228.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_UNIFIED:mpi4py.MPI.WIN_UNIFIED' on page 197 undefined on input line 32235.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOCHECK:mpi4py.MPI.MODE_NOCHECK' on page 197 undefined on input line 32242.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOSTORE:mpi4py.MPI.MODE_NOSTORE' on page 197 undefined on input line 32249.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOPUT:mpi4py.MPI.MODE_NOPUT' on page 197 undefined on input line 32256.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOPRECEDE:mpi4py.MPI.MODE_NOPRECEDE' on page 197 undefined on input line 32263.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_NOSUCCEED:mpi4py.MPI.MODE_NOSUCCEED' on page 197 undefined on input line 32270.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOCK_EXCLUSIVE:mpi4py.MPI.LOCK_EXCLUSIVE' on page 197 undefined on input line 32277.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOCK_SHARED:mpi4py.MPI.LOCK_SHARED' on page 197 undefined on input line 32284.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_RDONLY:mpi4py.MPI.MODE_RDONLY' on page 197 undefined on input line 32291.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_WRONLY:mpi4py.MPI.MODE_WRONLY' on page 197 undefined on input line 32298.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_RDWR:mpi4py.MPI.MODE_RDWR' on page 197 undefined on input line 32305.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_CREATE:mpi4py.MPI.MODE_CREATE' on page 197 undefined on input line 32312.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_EXCL:mpi4py.MPI.MODE_EXCL' on page 197 undefined on input line 32319.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_DELETE_ON_CLOSE:mpi4py.MPI.MODE_DELETE_ON_CLOSE' on page 197 undefined on input line 32326.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_UNIQUE_OPEN:mpi4py.MPI.MODE_UNIQUE_OPEN' on page 197 undefined on input line 32333.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_SEQUENTIAL:mpi4py.MPI.MODE_SEQUENTIAL' on page 197 undefined on input line 32340.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MODE_APPEND:mpi4py.MPI.MODE_APPEND' on page 197 undefined on input line 32347.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SEEK_SET:mpi4py.MPI.SEEK_SET' on page 197 undefined on input line 32354.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SEEK_CUR:mpi4py.MPI.SEEK_CUR' on page 197 undefined on input line 32361.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SEEK_END:mpi4py.MPI.SEEK_END' on page 197 undefined on input line 32368.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISPLACEMENT_CURRENT:mpi4py.MPI.DISPLACEMENT_CURRENT' on page 197 undefined on input line 32375.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DISP_CUR:mpi4py.MPI.DISP_CUR' on page 197 undefined on input line 32382.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_SINGLE:mpi4py.MPI.THREAD_SINGLE' on page 197 undefined on input line 32389.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_FUNNELED:mpi4py.MPI.THREAD_FUNNELED' on page 197 undefined on input line 32396.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_SERIALIZED:mpi4py.MPI.THREAD_SERIALIZED' on page 197 undefined on input line 32403.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.THREAD_MULTIPLE:mpi4py.MPI.THREAD_MULTIPLE' on page 197 undefined on input line 32410.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.VERSION:mpi4py.MPI.VERSION' on page 197 undefined on input line 32417.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SUBVERSION:mpi4py.MPI.SUBVERSION' on page 197 undefined on input line 32424.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_PROCESSOR_NAME:mpi4py.MPI.MAX_PROCESSOR_NAME' on page 197 undefined on input line 32431.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_ERROR_STRING:mpi4py.MPI.MAX_ERROR_STRING' on page 197 undefined on input line 32438.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_PORT_NAME:mpi4py.MPI.MAX_PORT_NAME' on page 197 undefined on input line 32445.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_INFO_KEY:mpi4py.MPI.MAX_INFO_KEY' on page 197 undefined on input line 32452.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_INFO_VAL:mpi4py.MPI.MAX_INFO_VAL' on page 197 undefined on input line 32459.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_OBJECT_NAME:mpi4py.MPI.MAX_OBJECT_NAME' on page 197 undefined on input line 32466.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_DATAREP_STRING:mpi4py.MPI.MAX_DATAREP_STRING' on page 197 undefined on input line 32473.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_LIBRARY_VERSION_STRING:mpi4py.MPI.MAX_LIBRARY_VERSION_STRING' on page 197 undefined on input line 32480.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_PSET_NAME_LEN:mpi4py.MPI.MAX_PSET_NAME_LEN' on page 197 undefined on input line 32487.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX_STRINGTAG_LEN:mpi4py.MPI.MAX_STRINGTAG_LEN' on page 197 undefined on input line 32494.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DATATYPE_NULL:mpi4py.MPI.DATATYPE_NULL' on page 197 undefined on input line 32501.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32504.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.PACKED:mpi4py.MPI.PACKED' on page 197 undefined on input line 32508.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32511.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BYTE:mpi4py.MPI.BYTE' on page 197 undefined on input line 32515.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32518.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.AINT:mpi4py.MPI.AINT' on page 197 undefined on input line 32522.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32525.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.OFFSET:mpi4py.MPI.OFFSET' on page 197 undefined on input line 32529.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32532.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COUNT:mpi4py.MPI.COUNT' on page 197 undefined on input line 32536.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32539.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CHAR:mpi4py.MPI.CHAR' on page 197 undefined on input line 32543.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32546.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WCHAR:mpi4py.MPI.WCHAR' on page 197 undefined on input line 32550.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32553.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_CHAR:mpi4py.MPI.SIGNED_CHAR' on page 197 undefined on input line 32557.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32560.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SHORT:mpi4py.MPI.SHORT' on page 197 undefined on input line 32564.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32567.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT:mpi4py.MPI.INT' on page 197 undefined on input line 32571.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32574.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG:mpi4py.MPI.LONG' on page 197 undefined on input line 32578.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32581.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_LONG:mpi4py.MPI.LONG_LONG' on page 197 undefined on input line 32585.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32588.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_CHAR:mpi4py.MPI.UNSIGNED_CHAR' on page 197 undefined on input line 32592.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32595.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_SHORT:mpi4py.MPI.UNSIGNED_SHORT' on page 197 undefined on input line 32599.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32602.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED:mpi4py.MPI.UNSIGNED' on page 197 undefined on input line 32606.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32609.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_LONG:mpi4py.MPI.UNSIGNED_LONG' on page 197 undefined on input line 32613.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32616.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_LONG_LONG:mpi4py.MPI.UNSIGNED_LONG_LONG' on page 197 undefined on input line 32620.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32623.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.FLOAT:mpi4py.MPI.FLOAT' on page 197 undefined on input line 32627.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32630.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE:mpi4py.MPI.DOUBLE' on page 197 undefined on input line 32634.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32637.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_DOUBLE:mpi4py.MPI.LONG_DOUBLE' on page 197 undefined on input line 32641.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32644.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_BOOL:mpi4py.MPI.C_BOOL' on page 197 undefined on input line 32648.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32651.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT8_T:mpi4py.MPI.INT8_T' on page 197 undefined on input line 32655.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32658.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT16_T:mpi4py.MPI.INT16_T' on page 197 undefined on input line 32662.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32665.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT32_T:mpi4py.MPI.INT32_T' on page 197 undefined on input line 32669.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32672.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT64_T:mpi4py.MPI.INT64_T' on page 197 undefined on input line 32676.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32679.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT8_T:mpi4py.MPI.UINT8_T' on page 197 undefined on input line 32683.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32686.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT16_T:mpi4py.MPI.UINT16_T' on page 197 undefined on input line 32690.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32693.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT32_T:mpi4py.MPI.UINT32_T' on page 197 undefined on input line 32697.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32700.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UINT64_T:mpi4py.MPI.UINT64_T' on page 197 undefined on input line 32704.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32707.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_COMPLEX:mpi4py.MPI.C_COMPLEX' on page 197 undefined on input line 32711.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32714.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_FLOAT_COMPLEX:mpi4py.MPI.C_FLOAT_COMPLEX' on page 197 undefined on input line 32718.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32721.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_DOUBLE_COMPLEX:mpi4py.MPI.C_DOUBLE_COMPLEX' on page 197 undefined on input line 32725.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32728.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.C_LONG_DOUBLE_COMPLEX:mpi4py.MPI.C_LONG_DOUBLE_COMPLEX' on page 197 undefined on input line 32732.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32735.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_BOOL:mpi4py.MPI.CXX_BOOL' on page 197 undefined on input line 32739.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32742.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_FLOAT_COMPLEX:mpi4py.MPI.CXX_FLOAT_COMPLEX' on page 197 undefined on input line 32746.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32749.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_DOUBLE_COMPLEX:mpi4py.MPI.CXX_DOUBLE_COMPLEX' on page 197 undefined on input line 32753.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32756.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX:mpi4py.MPI.CXX_LONG_DOUBLE_COMPLEX' on page 197 undefined on input line 32760.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32763.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SHORT_INT:mpi4py.MPI.SHORT_INT' on page 197 undefined on input line 32767.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32770.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INT_INT:mpi4py.MPI.INT_INT' on page 197 undefined on input line 32774.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32777.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.TWOINT:mpi4py.MPI.TWOINT' on page 197 undefined on input line 32781.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32784.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_INT:mpi4py.MPI.LONG_INT' on page 197 undefined on input line 32788.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32791.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.FLOAT_INT:mpi4py.MPI.FLOAT_INT' on page 197 undefined on input line 32795.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32798.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE_INT:mpi4py.MPI.DOUBLE_INT' on page 197 undefined on input line 32802.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32805.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LONG_DOUBLE_INT:mpi4py.MPI.LONG_DOUBLE_INT' on page 197 undefined on input line 32809.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32812.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.CHARACTER:mpi4py.MPI.CHARACTER' on page 197 undefined on input line 32816.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32819.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL:mpi4py.MPI.LOGICAL' on page 197 undefined on input line 32823.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32826.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER:mpi4py.MPI.INTEGER' on page 197 undefined on input line 32830.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32833.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL:mpi4py.MPI.REAL' on page 197 undefined on input line 32837.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32840.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE_PRECISION:mpi4py.MPI.DOUBLE_PRECISION' on page 197 undefined on input line 32844.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32847.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX:mpi4py.MPI.COMPLEX' on page 197 undefined on input line 32851.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32854.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.DOUBLE_COMPLEX:mpi4py.MPI.DOUBLE_COMPLEX' on page 197 undefined on input line 32858.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32861.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL1:mpi4py.MPI.LOGICAL1' on page 197 undefined on input line 32865.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 197 undefined on input line 32868.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL2:mpi4py.MPI.LOGICAL2' on page 197 undefined on input line 32872.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32875. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL4:mpi4py.MPI.LOGICA L4' on page 197 undefined on input line 32879. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32882. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOGICAL8:mpi4py.MPI.LOGICA L8' on page 197 undefined on input line 32886. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32889. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER1:mpi4py.MPI.INTEGE R1' on page 197 undefined on input line 32893. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32896. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER2:mpi4py.MPI.INTEGE R2' on page 197 undefined on input line 32900. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32903. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER4:mpi4py.MPI.INTEGE R4' on page 197 undefined on input line 32907. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32910. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER8:mpi4py.MPI.INTEGE R8' on page 197 undefined on input line 32914. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32917. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INTEGER16:mpi4py.MPI.INTEG ER16' on page 197 undefined on input line 32921. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32924. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL2:mpi4py.MPI.REAL2' on page 197 undefined on input line 32928. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32931. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL4:mpi4py.MPI.REAL4' on page 197 undefined on input line 32935. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32938. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL8:mpi4py.MPI.REAL8' on page 197 undefined on input line 32942. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32945. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REAL16:mpi4py.MPI.REAL16' on page 197 undefined on input line 32949. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32952. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX4:mpi4py.MPI.COMPLE X4' on page 197 undefined on input line 32956. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32959. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX8:mpi4py.MPI.COMPLE X8' on page 197 undefined on input line 32963. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32966. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX16:mpi4py.MPI.COMPL EX16' on page 197 undefined on input line 32970. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32973. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMPLEX32:mpi4py.MPI.COMPL EX32' on page 197 undefined on input line 32977. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32980. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.UNSIGNED_INT:mpi4py.MPI.UN SIGNED_INT' on page 197 undefined on input line 32984. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32987. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_SHORT:mpi4py.MPI.SI GNED_SHORT' on page 197 undefined on input line 32991. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 32994. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_INT:mpi4py.MPI.SIGN ED_INT' on page 197 undefined on input line 32998. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33001. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_LONG:mpi4py.MPI.SIG NED_LONG' on page 197 undefined on input line 33005. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33008. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SIGNED_LONG_LONG:mpi4py.MP I.SIGNED_LONG_LONG' on page 197 undefined on input line 33012. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33015. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOOL:mpi4py.MPI.BOOL' on p age 197 undefined on input line 33019. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33022. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT8_T:mpi4py.MPI.SINT8_T ' on page 197 undefined on input line 33026. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33029. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT16_T:mpi4py.MPI.SINT16 _T' on page 197 undefined on input line 33033. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33036. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT32_T:mpi4py.MPI.SINT32 _T' on page 197 undefined on input line 33040. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33043. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SINT64_T:mpi4py.MPI.SINT64 _T' on page 197 undefined on input line 33047. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33050. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_BOOL:mpi4py.MPI.F_BOOL' on page 197 undefined on input line 33054. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33057. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_INT:mpi4py.MPI.F_INT' on page 197 undefined on input line 33061. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33064. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_FLOAT:mpi4py.MPI.F_FLOAT ' on page 197 undefined on input line 33068. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33071. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_DOUBLE:mpi4py.MPI.F_DOUB LE' on page 197 undefined on input line 33075. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33078. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_COMPLEX:mpi4py.MPI.F_COM PLEX' on page 197 undefined on input line 33082. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33085. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_FLOAT_COMPLEX:mpi4py.MPI .F_FLOAT_COMPLEX' on page 197 undefined on input line 33089. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33092. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.F_DOUBLE_COMPLEX:mpi4py.MP I.F_DOUBLE_COMPLEX' on page 197 undefined on input line 33096. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 197 undefined on input line 33099. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REQUEST_NULL:mpi4py.MPI.RE QUEST_NULL' on page 197 undefined on input line 33103. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request ' on page 197 undefined on input line 33106. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MESSAGE_NULL:mpi4py.MPI.ME SSAGE_NULL' on page 197 undefined on input line 33110. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 197 undefined on input line 33113. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MESSAGE_NO_PROC:mpi4py.MPI .MESSAGE_NO_PROC' on page 197 undefined on input line 33117. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 197 undefined on input line 33120. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.OP_NULL:mpi4py.MPI.OP_NULL ' on page 197 undefined on input line 33124. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33127. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAX:mpi4py.MPI.MAX' on pag e 197 undefined on input line 33131. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33134. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MIN:mpi4py.MPI.MIN' on pag e 197 undefined on input line 33138. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33141. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SUM:mpi4py.MPI.SUM' on pag e 197 undefined on input line 33145. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33148. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.PROD:mpi4py.MPI.PROD' on p age 197 undefined on input line 33152. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33155. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LAND:mpi4py.MPI.LAND' on p age 197 undefined on input line 33159. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33162. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BAND:mpi4py.MPI.BAND' on p age 197 undefined on input line 33166. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33169. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LOR:mpi4py.MPI.LOR' on pag e 197 undefined on input line 33173. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33176. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BOR:mpi4py.MPI.BOR' on pag e 197 undefined on input line 33180. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33183. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.LXOR:mpi4py.MPI.LXOR' on p age 197 undefined on input line 33187. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33190. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BXOR:mpi4py.MPI.BXOR' on p age 197 undefined on input line 33194. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33197. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MAXLOC:mpi4py.MPI.MAXLOC' on page 197 undefined on input line 33201. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33204. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.MINLOC:mpi4py.MPI.MINLOC' on page 197 undefined on input line 33208. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33211. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.REPLACE:mpi4py.MPI.REPLACE ' on page 197 undefined on input line 33215. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33218. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.NO_OP:mpi4py.MPI.NO_OP' on page 197 undefined on input line 33222. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 197 undefined on input line 33225. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.GROUP_NULL:mpi4py.MPI.GROU P_NULL' on page 197 undefined on input line 33229. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 197 undefined on input line 33232. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.GROUP_EMPTY:mpi4py.MPI.GRO UP_EMPTY' on page 197 undefined on input line 33236. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 197 undefined on input line 33239. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INFO_NULL:mpi4py.MPI.INFO_ NULL' on page 197 undefined on input line 33243. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 197 undefined on input line 33246. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.INFO_ENV:mpi4py.MPI.INFO_E NV' on page 197 undefined on input line 33250. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 197 undefined on input line 33253. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRHANDLER_NULL:mpi4py.MPI .ERRHANDLER_NULL' on page 197 undefined on input line 33257. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 197 undefined on input line 33260. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_RETURN:mpi4py.MPI.E RRORS_RETURN' on page 197 undefined on input line 33264. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 197 undefined on input line 33267. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_ABORT:mpi4py.MPI.ER RORS_ABORT' on page 197 undefined on input line 33271. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 197 undefined on input line 33274. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.ERRORS_ARE_FATAL:mpi4py.MP I.ERRORS_ARE_FATAL' on page 197 undefined on input line 33278. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errh andler' on page 197 undefined on input line 33281. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.SESSION_NULL:mpi4py.MPI.SE SSION_NULL' on page 197 undefined on input line 33285. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session ' on page 197 undefined on input line 33288. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_NULL:mpi4py.MPI.COMM_ NULL' on page 197 undefined on input line 33292. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on p age 197 undefined on input line 33295. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_SELF:mpi4py.MPI.COMM_ SELF' on page 197 undefined on input line 33299. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 197 undefined on input line 33302. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.COMM_WORLD:mpi4py.MPI.COMM _WORLD' on page 197 undefined on input line 33306. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intra comm' on page 197 undefined on input line 33309. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.WIN_NULL:mpi4py.MPI.WIN_NU LL' on page 197 undefined on input line 33313. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on pag e 197 undefined on input line 33316. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.FILE_NULL:mpi4py.MPI.FILE_ NULL' on page 197 undefined on input line 33320. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on p age 197 undefined on input line 33323. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.pickle:mpi4py.MPI.pickle' on page 197 undefined on input line 33327. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle' on page 197 undefined on input line 33330. [197] [198] [199] [200] Package longtable Warning: Column widths have changed (longtable) in table 8 on input line 33333. [201] [202] [203] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.Bott omType' on page 204 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BottomType:mpi4py.MPI.Bott omType' on page 204 undefined on input line 33435. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.InPlaceType:mpi4py.MPI.InP laceType' on page 204 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.InPlaceType:mpi4py.MPI.InP laceType' on page 204 undefined on input line 33452. [204] [205] [206] [207] [208] [209] [210] [211] [212] [213] [214] [215] [216] [217] [218] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BufferAutomaticType:mpi4py .MPI.BufferAutomaticType' on page 219 undefined on input line 1. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.BufferAutomaticType:mpi4py.MPI.BufferAutomaticType' on page 219 undefined on input line 35577.
[219] [220] [221] [222] [223] [224]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 225 undefined on input line 1.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Datatype' on page 225 undefined on input line 36325.
[the same pair of `Hyper reference ... undefined' warnings for mpi4py.MPI.Datatype repeats on pages 225-234, at input lines 36342 through 37566 in steps of 17, interleaved with the page markers [225] through [233]]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 234 undefined on input line 37583. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 234 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 234 undefined on input line 37600. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 234 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 234 undefined on input line 37617. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 234 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 234 undefined on input line 37634. [234] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37651. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37668. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37685. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37702. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37719. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37736. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37753. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Datatype:mpi4py.MPI.Dataty pe' on page 235 undefined on input line 37770. [235] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request ' on page 236 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Request:mpi4py.MPI.Request ' on page 236 undefined on input line 37787. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 236 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 236 undefined on input line 37804. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 236 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Message:mpi4py.MPI.Message ' on page 236 undefined on input line 37821. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 236 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 236 undefined on input line 37838. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 236 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 236 undefined on input line 37872. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 236 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 236 undefined on input line 37906. [236] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 37940. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 37974. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 38008. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 237 undefined on input line 38042. [237] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 38076. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 38110. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 1. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 38144. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 238 undefined on input line 38178. [238] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 38212. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 38246. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 38280. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Op:mpi4py.MPI.Op' on page 239 undefined on input line 38314. [239] LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 240 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 240 undefined on input line 38348. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 240 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Group:mpi4py.MPI.Group' on page 240 undefined on input line 38365. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 240 undefined on input line 1. LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on p age 240 undefined on input line 38382. 
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Info:mpi4py.MPI.Info' on page 240 undefined on input line 38399.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Errhandler:mpi4py.MPI.Errhandler' on page 240 undefined on input line 38416.
[... the Errhandler warning repeats on page 240 (input lines 38433-38467) ...]
[240]
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Session:mpi4py.MPI.Session' on page 241 undefined on input line 38484.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Comm:mpi4py.MPI.Comm' on page 241 undefined on input line 38501.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 241 undefined on input line 38518.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Intracomm:mpi4py.MPI.Intracomm' on page 241 undefined on input line 38535.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Win:mpi4py.MPI.Win' on page 241 undefined on input line 38552.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.File:mpi4py.MPI.File' on page 241 undefined on input line 38569.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI.Pickle:mpi4py.MPI.Pickle' on page 241 undefined on input line 38586.
[241]
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_BACKEND' on page 242 undefined on input line 38650.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_BACKEND' on page 242 undefined on input line 38679.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 242 undefined on input line 38697.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_MPICC' on page 242 undefined on input line 38698.
LaTeX Warning: Hyper reference `reference/mpi4py.MPI:module-mpi4py.MPI' on page 242 undefined on input line 38713.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_MPILD' on page 242 undefined on input line 38714.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_MPICFG' on page 242 undefined on input line 38733.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_CONFIGURE' on page 242 undefined on input line 38751.
[242]
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_MPICC' on page 243 undefined on input line 38773.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_MPILD' on page 243 undefined on input line 38785.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_MPICFG' on page 243 undefined on input line 38797.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_BACKEND' on page 243 undefined on input line 38807.
LaTeX Warning: Hyper reference `install:envvar-MPI4PY_BUILD_BACKEND' on page 243 undefined on input line 38828.
[243] [244] [245] [246]
LaTeX Warning: Hyper reference `install:envvar-MPICFG' on page 247 undefined on input line 39142.
LaTeX Warning: Hyper reference `install:envvar-MPICC' on page 247 undefined on input line 39156.
LaTeX Warning: Hyper reference `install:envvar-MPILD' on page 247 undefined on input line 39169.
LaTeX Warning: Hyper reference `develop:cmdoption-mpicc' on page 247 undefined on input line 39209.
LaTeX Warning: Hyper reference `install:envvar-MPICC' on page 247 undefined on input line 39210.
[247] [248]
LaTeX Warning: Citation `guidelines:mpi-abi-paper' on page 249 undefined on input line 39420.
LaTeX Warning: Citation `guidelines:mpi-abi-issue' on page 249 undefined on input line 39421.
[249] [250]
Underfull \hbox (badness 10000) in paragraph at lines 39629--39632 (Add methods Comm.Create_errhandler(), Win.Create_errhandler(), and File. ...)
[251] [252] [253] [254] [255]
Underfull \hbox (badness 10000) in paragraph at lines 40229--40234 (port_name, info=INFO_NULL) and Unpublish_name(service_name, port_name, ...)
Underfull \hbox (badness 10000) in paragraph at lines 40241--40245 (Change signature of Win.Lock(). The new signature is Win.Lock(rank, ...)
[256] [257] [258] [259] [260]
LaTeX Warning: Reference `mpi4py:module-mpi4py' on page 261 undefined on input line 40658.
[... analogous undefined Reference warnings on page 261 for mpi4py.bench, mpi4py.futures, reference/mpi4py.MPI, mpi4py.run, mpi4py.typing, mpi4py.util, mpi4py.util.dtlib, mpi4py.util.pkl5, mpi4py.util.pool, and mpi4py.util.sync (input lines 40659-40668) ...]
[261]
No file mpi4py.ind.
Package longtable Warning: Table widths have changed. Rerun LaTeX.
(./mpi4py.aux)
LaTeX Warning: There were undefined references.
LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right.
Package rerunfilecheck Warning: File `mpi4py.out' has changed.
(rerunfilecheck)                Rerun to get outlines right
(rerunfilecheck)                or use package `bookmark'.
)
(see the transcript file for additional information)
Output written on mpi4py.pdf (261 pages, 805517 bytes).
Transcript written on mpi4py.log.
Latexmk: Missing input file 'mpi4py.toc' (or dependence on it) from following:
  No file mpi4py.toc.
Latexmk: Missing input file 'mpi4py.ind' (or dependence on it) from following:
  No file mpi4py.ind.
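The first pdflatex pass above ends with undefined references and missing `mpi4py.toc`/`mpi4py.ind` files, which is exactly what forces the reruns that follow. As a hedged sketch (not part of the build), these rerun triggers can be spotted in any pdflatex log with a grep; `sample.log` here is a stand-in file created for illustration, not the real `mpi4py.log`:

```shell
# Create a stand-in log containing the three rerun-trigger messages seen above,
# then count them. Point grep at the real .log file in practice.
printf '%s\n' \
  'LaTeX Warning: There were undefined references.' \
  'LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right.' \
  'Package longtable Warning: Table widths have changed. Rerun LaTeX.' \
  > sample.log
grep -c -E 'Rerun|undefined references' sample.log
```

This is the same check latexmk performs internally (by parsing the log) before deciding to apply the `pdflatex` rule again.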
Latexmk: Getting log file 'mpi4py.log'
Latexmk: Examining 'mpi4py.fls'
Latexmk: Examining 'mpi4py.log'
Latexmk: Index file 'mpi4py.idx' was written
Latexmk: References changed.
Latexmk: References changed.
Latexmk: Log file says output to 'mpi4py.pdf'
Have index file 'mpi4py.idx', mpi4py.ind mpi4py
Latexmk: applying rule 'makeindex mpi4py.idx'...
Rule 'makeindex mpi4py.idx': Reasons for rerun
Category 'other':
  Rerun of 'makeindex mpi4py.idx' forced or previously required:
    Reason or flag: 'Initial set up of rule'
------------
Run number 1 of rule 'makeindex mpi4py.idx'
------------
------------
Running 'makeindex -s python.ist -o "mpi4py.ind" "mpi4py.idx"'
------------
This is makeindex, version 2.17 [TeX Live 2025/dev] (kpathsea + Thai support).
Scanning style file ./python.ist.......done (7 attributes redefined, 0 ignored).
Scanning input file mpi4py.idx.....done (1251 entries accepted, 0 rejected).
Sorting entries.............done (13596 comparisons).
Generating output file mpi4py.ind.....done (1321 lines written, 0 warnings).
Output written in mpi4py.ind.
Transcript written in mpi4py.ilg.
Latexmk: applying rule 'pdflatex'...
Rule 'pdflatex': Reasons for rerun
Changed files or newly in use/created: mpi4py.aux mpi4py.ind mpi4py.out mpi4py.toc
------------
Run number 2 of rule 'pdflatex'
------------
------------
Running 'pdflatex -recorder "mpi4py.tex"'
------------
This is pdfTeX, Version 3.141592653-2.6-1.40.26 (TeX Live 2025/dev/Debian) (preloaded format=pdflatex)
 restricted \write18 enabled.
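What latexmk automates above is a fixed-point loop: run pdflatex, run makeindex if a new `.idx` appeared, and rerun pdflatex while the log still asks for another pass. A toy sketch of that loop in shell, with the compiler faked by a variable (`log`, `passes`, and `demo.pdf` are illustrative placeholders, not latexmk internals):

```shell
# Toy model of latexmk's rerun loop: keep "running the compiler" while the
# previous log requests a rerun. The first fake pass reports unresolved
# cross-references; the second resolves them, so the loop stops.
log='LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right.'
passes=1
while printf '%s\n' "$log" | grep -q 'Rerun'; do
  passes=$((passes + 1))            # another pdflatex pass
  log='Output written on demo.pdf'  # rerun resolved the references
done
echo "pdflatex passes: $passes"
```

The real tool additionally tracks file checksums (`.aux`, `.toc`, `.ind`, `.out`) via the `-recorder` `.fls` file, which is why the run above lists "Changed files or newly in use/created" as its rerun reason.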
entering extended mode (./mpi4py.tex LaTeX2e <2024-06-01> patch level 2 L3 programming layer <2024-08-16> (./sphinxhowto.cls Document Class: sphinxhowto 2019/12/01 v2.3.0 Document class (Sphinx howto) (/usr/share/texlive/texmf-dist/tex/latex/base/article.cls Document Class: article 2024/02/08 v1.4n Standard LaTeX document class (/usr/share/texlive/texmf-dist/tex/latex/base/size10.clo))) (/usr/share/texlive/texmf-dist/tex/latex/base/inputenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/cmap/cmap.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/fontenc.sty<>) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsmath.sty For additional information on amsmath, use the `?' option. (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amstext.sty (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsgen.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsbsy.sty) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsopn.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amssymb.sty (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amsfonts.sty)) (/usr/share/texlive/texmf-dist/tex/generic/babel/babel.sty (/usr/share/texlive/texmf-dist/tex/generic/babel/txtbabel.def) (/usr/share/texlive/texmf-dist/tex/generic/babel-english/english.ldf)) (/usr/share/texlive/texmf-dist/tex/generic/babel/locale/en/babel-english.tex) (/usr/share/texmf/tex/latex/tex-gyre/tgtermes.sty (/usr/share/texlive/texmf-dist/tex/latex/kvoptions/kvoptions.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/keyval.sty) (/usr/share/texlive/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty) (/usr/share/texlive/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty))) (/usr/share/texmf/tex/latex/tex-gyre/tgheros.sty) (/usr/share/texlive/texmf-dist/tex/latex/fncychap/fncychap.sty) (./sphinx.sty (/usr/share/texlive/texmf-dist/tex/latex/xcolor/xcolor.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/color.cfg) (/usr/share/texlive/texmf-dist/tex/latex/graphics-def/pdftex.def) 
(/usr/share/texlive/texmf-dist/tex/latex/graphics/mathcolor.ltx)) (./sphinxoptionshyperref.sty) (./sphinxoptionsgeometry.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/textcomp.sty) (/usr/share/texlive/texmf-dist/tex/latex/float/float.sty) (/usr/share/texlive/texmf-dist/tex/latex/wrapfig/wrapfig.sty) (/usr/share/texlive/texmf-dist/tex/latex/capt-of/capt-of.sty) (/usr/share/texlive/texmf-dist/tex/latex/tools/multicol.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphicx.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphics.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/trig.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/graphics.cfg))) (./sphinxlatexgraphics.sty) (./sphinxpackageboxes.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.cfg) (/usr/share/texlive/texmf-dist/tex/latex/pict2e/p2e-pdftex.def)) (/usr/share/texlive/texmf-dist/tex/latex/ellipse/ellipse.sty)) (./sphinxlatexadmonitions.sty (/usr/share/texlive/texmf-dist/tex/latex/framed/framed.sty)) (./sphinxlatexliterals.sty (/usr/share/texlive/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/alltt.sty) (/usr/share/texlive/texmf-dist/tex/latex/upquote/upquote.sty) (/usr/share/texlive/texmf-dist/tex/latex/needspace/needspace.sty)) (./sphinxlatexshadowbox.sty) (./sphinxlatexcontainers.sty) (./sphinxhighlight.sty) (./sphinxlatextables.sty (/usr/share/texlive/texmf-dist/tex/latex/tabulary/tabulary.sty (/usr/share/texlive/texmf-dist/tex/latex/tools/array.sty)) (/usr/share/texlive/texmf-dist/tex/latex/tools/longtable.sty) (/usr/share/texlive/texmf-dist/tex/latex/varwidth/varwidth.sty) (/usr/share/texlive/texmf-dist/tex/latex/colortbl/colortbl.sty) (/usr/share/texlive/texmf-dist/tex/latex/booktabs/booktabs.sty)) (./sphinxlatexnumfig.sty) (./sphinxlatexlists.sty) (./sphinxpackagefootnote.sty ) (./sphinxlatexindbibtoc.sty 
(/usr/share/texlive/texmf-dist/tex/latex/base/makeidx.sty)) (./sphinxlatexstylepage.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip-2001-04-09.sty)) (/usr/share/texlive/texmf-dist/tex/latex/fancyhdr/fancyhdr.sty)) (./sphinxlatexstyleheadings.sty (/usr/share/texlive/texmf-dist/tex/latex/titlesec/titlesec.sty)) (./sphinxlatexstyletext.sty) (./sphinxlatexobjects.sty)) (/usr/share/texlive/texmf-dist/tex/latex/geometry/geometry.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/ifvtex.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/iftex.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hyperref.sty (/usr/share/texlive/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty) (/usr/share/texlive/texmf-dist/tex/generic/pdfescape/pdfescape.sty (/usr/share/texlive/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty (/usr/share/texlive/texmf-dist/tex/generic/infwarerr/infwarerr.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hycolor/hycolor.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/nameref.sty (/usr/share/texlive/texmf-dist/tex/latex/refcount/refcount.sty) (/usr/share/texlive/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty)) (/usr/share/texlive/texmf-dist/tex/latex/etoolbox/etoolbox.sty) (/usr/share/texlive/texmf-dist/tex/generic/stringenc/stringenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/pd1enc.def) (/usr/share/texlive/texmf-dist/tex/generic/intcalc/intcalc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/puenc.def) (/usr/share/texlive/texmf-dist/tex/latex/url/url.sty) (/usr/share/texlive/texmf-dist/tex/generic/bitset/bitset.sty (/usr/share/texlive/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty)) (/usr/share/texlive/texmf-dist/tex/latex/base/atbegshi-ltx.sty)) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hpdftex.def (/usr/share/texlive/texmf-dist/tex/latex/base/atveryend-ltx.sty) 
(/usr/share/texlive/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
(/usr/share/texlive/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty)))
(/usr/share/texlive/texmf-dist/tex/latex/hypcap/hypcap.sty
(/usr/share/texlive/texmf-dist/tex/latex/letltxmacro/letltxmacro.sty))
(./sphinxmessages.sty)
Writing index file mpi4py.idx
(/usr/share/texmf/tex/latex/tex-gyre/t1qtm.fd)
(/usr/share/texlive/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def)
LaTeX Warning: Unused global option(s): [a4].
(./mpi4py.aux)
(/usr/share/texlive/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
[Loading MPS to PDF converter (version 2006.09.02).] )
(/usr/share/texlive/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
(/usr/share/texlive/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg))
*geometry* driver: auto-detecting
*geometry* detected driver: pdftex
(./mpi4py.out) (./mpi4py.out)
(/usr/share/texmf/tex/latex/tex-gyre/t1qhv.fd)<><><><>
(/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsa.fd)
(/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsb.fd)
(./mpi4py.toc [1{/var/lib/texmf/fonts/map/pdftex/updmap/pdftex.map}{/usr/share/texmf/fonts/enc/dvips/tex-gyre/q-ec.enc}] [2]) [3]
(/usr/share/texmf/tex/latex/tex-gyre/ts1qtm.fd) [4{/usr/share/texmf/fonts/enc/dvips/tex-gyre/q-ts1.enc}]
(/usr/share/texlive/texmf-dist/tex/latex/txfonts/t1txtt.fd) [5] [6] [7] [8] [9] [10] [11]
Underfull \hbox (badness 6691) in paragraph at lines 942--951 (MPI for Python uses the highest protocol version available in the Python run-time (see the ...)
[12] (/usr/share/texlive/texmf-dist/tex/latex/txfonts/ts1txtt.fd) [13] [14]
Overfull \vbox (1.8739pt too high) detected at line 1202
[15] [16] [17] [18] [19] [20] [21] [22] [23{/usr/share/texlive/texmf-dist/fonts/enc/dvips/base/8r.enc}] [24] [25] [26]
Package tabulary Warning: No suitable columns! on input line 2462.
[27]
Package tabulary Warning: No suitable columns! on input line 2531.
Package tabulary Warning: No suitable columns! on input line 2551.
Package tabulary Warning: No suitable columns! on input line 2571.
Package tabulary Warning: No suitable columns! on input line 2598.
[28]
Package tabulary Warning: No suitable columns! on input line 2625.
Package tabulary Warning: No suitable columns! on input line 2655.
Underfull \hbox (badness 10000) in paragraph at lines 2717--2717 (Indicate whether this thread called Init or ...)
Package tabulary Warning: No suitable columns! on input line 2717.
Underfull \hbox (badness 10000) in paragraph at lines 2717--2717 (Indicate whether this thread called Init or ...)
Package tabulary Warning: No suitable columns! on input line 2744.
Package tabulary Warning: No suitable columns! on input line 2778.
[29]
Package tabulary Warning: No suitable columns! on input line 2805.
Package tabulary Warning: No suitable columns! on input line 2874.
Package tabulary Warning: No suitable columns! on input line 2922.
Package tabulary Warning: No suitable columns! on input line 2991.
[30]
Package tabulary Warning: No suitable columns! on input line 3011.
Underfull \hbox (badness 10000) in paragraph at lines 3098--3101 (Constant BUFFER_AUTOMATIC of type ...)
[31] [32] [33] [34] [35]
Package tabulary Warning: No suitable columns! on input line 5137.
[36] [37] [38]
Underfull \hbox warnings (badness 10000/6188/6268/8151) in paragraph at lines 5445--5447 (long Tuple[...] type-alias lines combining Sequence[Integral], SupportsBuffer | SupportsDLPack | SupportsCAI, Datatype | str, and BottomType | None; font-run details omitted)
Underfull \hbox warnings (badness 10000) in paragraph at lines 5482--5484 (alias of Tuple[SupportsBuffer | SupportsDLPack | SupportsCAI, Sequence[Datatype]] and related overloads; font-run details omitted)
[39] [40] [41] [42] [43] [44] [45] [46] [47] [48] [49] [50] [51] [52] [53] [54] [55] [56] [57] [58] [59] [60] [61] [62] [63] [64] [65] [66] [67] [68] [69] [70]
Package tabulary Warning: No suitable columns! on input line 9423.
Package tabulary Warning: No suitable columns! on input line 9626.
[71] [72] Underfull \hbox (badness 5533) in paragraph at lines 9787--9787 []|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with Package tabulary Warning: No suitable columns! on input line 9787. Underfull \hbox (badness 5533) in paragraph at lines 9787--9787 []|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with Package tabulary Warning: No suitable columns! on input line 9842. [73] [74] [75] [76] [77] Package tabulary Warning: No suitable columns! on input line 11183. [78] [79] [80] [81] [82] [83] [84] [85] [86] [87] [88] [89] [90] [91] [92] [93] [94] [95] [96] [97] [98] [99] [100] [101] [102] [103] [104] [105] Underfull \hbox (badness 10000) in paragraph at lines 15543--15545 []|[][]\T1/txtt/m/sl/10 Create_hindexed_block[][]\T1/qtm/m/n/10 (blocklength, d is-place- Underfull \hbox (badness 10000) in paragraph at lines 15564--15566 []|[][]\T1/txtt/m/sl/10 Create_indexed_block[][]\T1/qtm/m/n/10 (blocklength, di s-place- Underfull \hbox (badness 5203) in paragraph at lines 15742--15745 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to Underfull \hbox (badness 5203) in paragraph at lines 15749--15752 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to [106] [107] Package tabulary Warning: No suitable columns! on input line 15928. [108] [109] [110] [111] [112] [113] [114] [115] [116] Package tabulary Warning: No suitable columns! on input line 17362. [117] Package tabulary Warning: No suitable columns! on input line 17486. Package tabulary Warning: No suitable columns! on input line 17506. [118] [119] [120] Package tabulary Warning: No suitable columns! on input line 18195. [121] [122] [123] [124] [125] [126] [127] [128] [129] [130] [131] Package tabulary Warning: No suitable columns! on input line 19958. Package tabulary Warning: No suitable columns! on input line 20027. [132] [133] Package tabulary Warning: No suitable columns! on input line 20278. [134] Package tabulary Warning: No suitable columns! 
on input line 20538. Package tabulary Warning: No suitable columns! on input line 20572. [135] [136] [137] [138] Package tabulary Warning: No suitable columns! on input line 21252. Package tabulary Warning: No suitable columns! on input line 21272. [139] [140] [141] [142] Package tabulary Warning: No suitable columns! on input line 21828. Package tabulary Warning: No suitable columns! on input line 21855. [143] Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|[][]\T1/txtt/m/sl/10 Create_dist_graph[][]\T1/qtm/m/n/10 (sources, de-grees, des-ti-na- Package tabulary Warning: No suitable columns! on input line 22187. Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|[][]\T1/txtt/m/sl/10 Create_dist_graph[][]\T1/qtm/m/n/10 (sources, de-grees, des-ti-na- [144] [145] [146] [147] [148] [149] Package tabulary Warning: No suitable columns! on input line 23061. Package tabulary Warning: No suitable columns! on input line 23081. [150] [151] [152] Package tabulary Warning: No suitable columns! on input line 23516. Package tabulary Warning: No suitable columns! on input line 23550. [153] [154] Package tabulary Warning: No suitable columns! on input line 23836. Package tabulary Warning: No suitable columns! on input line 23863. [155] Package tabulary Warning: No suitable columns! on input line 24078. [156] [157] [158] Package tabulary Warning: No suitable columns! on input line 24524. [159] [160] [161] [162] [163] Package tabulary Warning: No suitable columns! on input line 25505. [164] Package tabulary Warning: No suitable columns! on input line 25525. [165] [166] [167] Package tabulary Warning: No suitable columns! on input line 26076. Package tabulary Warning: No suitable columns! on input line 26124. 
[168] [169] [170] Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgather_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgatherv_init[][]\T1/qtm/m/n/10 (sendbuf, re cvbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoall_init[][]\T1/qtm/m/n/10 (sendbuf, recv buf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallv_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallw_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Package tabulary Warning: No suitable columns! on input line 26636. Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgather_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgatherv_init[][]\T1/qtm/m/n/10 (sendbuf, re cvbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoall_init[][]\T1/qtm/m/n/10 (sendbuf, recv buf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallv_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallw_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Package tabulary Warning: No suitable columns! on input line 26691. [171] [172] [173] [174] [175] [176] Package tabulary Warning: No suitable columns! on input line 27777. [177] [178] [179] [180] [181] [182] [183] [184] [185] Package tabulary Warning: No suitable columns! on input line 29311. [186] Package tabulary Warning: No suitable columns! on input line 29366. 
[187] [188] Package tabulary Warning: No suitable columns! on input line 29658. Package tabulary Warning: No suitable columns! on input line 29728. Package tabulary Warning: No suitable columns! on input line 29762. [189] Underfull \hbox (badness 10000) in paragraph at lines 30068--30071 []|\T1/qtm/m/n/10 In-di-cate whether this thread called [][]\T1/txtt/m/sl/10 In it[][] \T1/qtm/m/n/10 or [190] [191] [192] [193] [194] [195] [196] [197] [198] Underfull \hbox (badness 10000) in paragraph at lines 32195--32198 []|\T1/qtm/m/n/10 Con-stant \T1/txtt/m/n/10 BUFFER_AUTOMATIC \T1/qtm/m/n/10 of type [199] [200] [201] [202] [203] [204] [205] [206] [207] [208] [209] [210] [211] [212] [213] [214] [215] [216] [217] [218] [219] [220] [221] [222] [223] [224] [225] [226] [227] [228] [229] [230] [231] [232] [233] [234] [235] [236] [237] [238] [239] [240] [241] [242] [243] [244] [245] [246] [247] [248] [249] [250] [251] [252] Underfull \hbox (badness 10000) in paragraph at lines 39629--39632 []\T1/qtm/m/n/10 Add meth-ods \T1/txtt/m/n/10 Comm.Create_errhandler()\T1/qtm/m /n/10 , \T1/txtt/m/n/10 Win.Create_errhandler()\T1/qtm/m/n/10 , and \T1/txtt/m/ n/10 File. [253] [254] [255] [256] [257] Underfull \hbox (badness 10000) in paragraph at lines 40229--40234 \T1/txtt/m/n/10 port_name, info=INFO_NULL) \T1/qtm/m/n/10 and \T1/txtt/m/n/10 U npublish_name(service_name, port_name, Underfull \hbox (badness 10000) in paragraph at lines 40241--40245 []\T1/qtm/m/n/10 Change sig-na-ture of \T1/txtt/m/n/10 Win.Lock()\T1/qtm/m/n/10 . The new sig-na-ture is \T1/txtt/m/n/10 Win.Lock(rank, [258] [259] [260] [261] [262] [263] (./mpi4py.ind Underfull \hbox (badness 10000) in paragraph at lines 24--26 []\T1/txtt/m/n/10 __new__() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.BufferAu tomaticType static [264] Underfull \hbox (badness 10000) in paragraph at lines 160--161 []\T1/txtt/m/n/10 Call_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. 
Session method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 203--204 []\T1/txtt/m/n/10 COMM_TYPE_RESOURCE_GUIDED \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [265] Underfull \hbox (badness 10000) in paragraph at lines 243--244 []\T1/txtt/m/n/10 Create_contiguous() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Datatype Underfull \hbox (badness 10000) in paragraph at lines 245--246 []\T1/txtt/m/n/10 Create_dist_graph() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Intracomm Underfull \hbox (badness 10000) in paragraph at lines 246--248 []\T1/txtt/m/n/10 Create_dist_graph_adjacent() Underfull \hbox (badness 10000) in paragraph at lines 250--251 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Comm class Underfull \hbox (badness 10000) in paragraph at lines 251--252 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.File class Underfull \hbox (badness 10000) in paragraph at lines 252--253 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Session class Underfull \hbox (badness 10000) in paragraph at lines 253--254 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Win class Underfull \hbox (badness 10000) in paragraph at lines 258--259 []\T1/txtt/m/n/10 Create_f90_real() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. 
Datatype class Underfull \hbox (badness 10000) in paragraph at lines 263--265 []\T1/txtt/m/n/10 Create_from_session_pset() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mp i4py.MPI.Group Underfull \hbox (badness 10000) in paragraph at lines 269--270 []\T1/txtt/m/n/10 Create_hindexed_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4p y.MPI.Datatype Underfull \hbox (badness 10000) in paragraph at lines 270--271 []\T1/txtt/m/n/10 Create_hvector() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 271--272 []\T1/txtt/m/n/10 Create_indexed() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 272--273 []\T1/txtt/m/n/10 Create_indexed_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Datatype Underfull \hbox (badness 10000) in paragraph at lines 273--274 []\T1/txtt/m/n/10 Create_intercomm() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI .Intracomm Underfull \hbox (badness 10000) in paragraph at lines 275--276 []\T1/txtt/m/n/10 Create_keyval() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Da tatype class Underfull \hbox (badness 10000) in paragraph at lines 277--278 []\T1/txtt/m/n/10 Create_resized() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 278--279 []\T1/txtt/m/n/10 Create_struct() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Da tatype class [266] [267] Underfull \hbox (badness 10000) in paragraph at lines 420--421 []\T1/txtt/m/n/10 ERR_UNSUPPORTED_OPERATION \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [268] Underfull \hbox (badness 5161) in paragraph at lines 524--525 []\T1/txtt/m/n/10 fromhandle() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Messa ge class method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 560--562 []\T1/txtt/m/n/10 Get_dist_neighbors_count() Underfull \hbox (badness 6725) in paragraph at lines 571--572 
[]\T1/txtt/m/n/10 Get_error_code() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.E xception method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 590--591 []\T1/txtt/m/n/10 Get_neighbors_count() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Graphcomm Underfull \hbox (badness 10000) in paragraph at lines 602--603 []\T1/txtt/m/n/10 Get_remote_group() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI .Intercomm [269] Underfull \hbox (badness 10000) in paragraph at lines 612--613 []\T1/txtt/m/n/10 Get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 613--614 []\T1/txtt/m/n/10 get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 614--616 []\T1/txtt/m/n/10 get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.util. pkl5.Request class Underfull \hbox (badness 10000) in paragraph at lines 616--617 []\T1/txtt/m/n/10 Get_status_any() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 617--618 []\T1/txtt/m/n/10 get_status_any() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 618--619 []\T1/txtt/m/n/10 Get_status_some() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Request class Underfull \hbox (badness 10000) in paragraph at lines 619--620 []\T1/txtt/m/n/10 get_status_some() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Request class Underfull \hbox (badness 10000) in paragraph at lines 626--627 []\T1/txtt/m/n/10 Get_value_index() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Datatype class Underfull \hbox (badness 10000) in paragraph at lines 692--693 []\T1/txtt/m/n/10 Ineighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 693--694 []\T1/txtt/m/n/10 Ineighbor_allgatherv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 694--695 []\T1/txtt/m/n/10 Ineighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 695--696 []\T1/txtt/m/n/10 Ineighbor_alltoallv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 696--697 []\T1/txtt/m/n/10 Ineighbor_alltoallw() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Topocomm [270] Underfull \hbox (badness 10000) in paragraph at lines 747--748 []\T1/txtt/m/n/10 Ireduce_scatter_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4p y.MPI.Comm Underfull \hbox (badness 10000) in paragraph at lines 777--778 []\T1/txtt/m/n/10 istarmap_unordered() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.u til.pool.Pool [271] Underfull \hbox (badness 10000) in paragraph at lines 835--836 []\T1/txtt/m/n/10 MAX_LIBRARY_VERSION_STRING \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [272] Underfull \hbox (badness 10000) in paragraph at lines 936--937 []\T1/txtt/m/n/10 Neighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 937--938 []\T1/txtt/m/n/10 neighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 938--940 []\T1/txtt/m/n/10 Neighbor_allgather_init() Underfull \hbox (badness 10000) in paragraph at lines 940--941 []\T1/txtt/m/n/10 Neighbor_allgatherv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 941--943 []\T1/txtt/m/n/10 Neighbor_allgatherv_init() Underfull \hbox (badness 10000) in paragraph at lines 943--944 []\T1/txtt/m/n/10 Neighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 944--945 []\T1/txtt/m/n/10 neighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 946--947 []\T1/txtt/m/n/10 Neighbor_alltoallv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 947--949 []\T1/txtt/m/n/10 Neighbor_alltoallv_init() Underfull \hbox (badness 10000) in paragraph at lines 949--950 []\T1/txtt/m/n/10 Neighbor_alltoallw() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 950--952 []\T1/txtt/m/n/10 Neighbor_alltoallw_init() Underfull \hbox (badness 10000) in paragraph at lines 959--960 []\T1/txtt/m/n/10 num_workers \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.futures.MP IPoolExecutor at- Underfull \hbox (badness 10000) in paragraph at lines 980--981 []\T1/txtt/m/n/10 Pack_external_size() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Datatype [273] Underfull \hbox (badness 10000) in paragraph at lines 1046--1047 []\T1/txtt/m/n/10 Read_ordered_begin() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.File method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 1069--1070 []\T1/txtt/m/n/10 Reduce_scatter_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Comm Underfull \hbox (badness 10000) in paragraph at lines 1071--1072 []\T1/txtt/m/n/10 Reduce_scatter_init() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Comm [274] [275] Underfull \hbox (badness 10000) in paragraph at lines 1311--1312 []\T1/txtt/m/n/10 Write_at_all_begin() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.File method\T1/qtm/m/n/10 ), [276]) (./mpi4py.aux) LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right. Package rerunfilecheck Warning: File `mpi4py.out' has changed. (rerunfilecheck) Rerun to get outlines right (rerunfilecheck) or use package `bookmark'. ) (see the transcript file for additional information) Output written on mpi4py.pdf (276 pages, 1023255 bytes). Transcript written on mpi4py.log. Latexmk: Getting log file 'mpi4py.log' Latexmk: Examining 'mpi4py.fls' Latexmk: Examining 'mpi4py.log' Latexmk: Index file 'mpi4py.idx' was written Latexmk: References changed. Latexmk: References changed. Latexmk: Log file says output to 'mpi4py.pdf' Have index file 'mpi4py.idx', mpi4py.ind mpi4py Latexmk: applying rule 'makeindex mpi4py.idx'... Rule 'makeindex mpi4py.idx': Reasons for rerun Changed files or newly in use/created: mpi4py.idx ------------ Run number 2 of rule 'makeindex mpi4py.idx' ------------ ------------ Running 'makeindex -s python.ist -o "mpi4py.ind" "mpi4py.idx"' ------------ This is makeindex, version 2.17 [TeX Live 2025/dev] (kpathsea + Thai support). Scanning style file ./python.ist.......done (7 attributes redefined, 0 ignored). Scanning input file mpi4py.idx.....done (1251 entries accepted, 0 rejected). Sorting entries.............done (13596 comparisons). Generating output file mpi4py.ind.....done (1321 lines written, 0 warnings). Output written in mpi4py.ind. Transcript written in mpi4py.ilg. Latexmk: applying rule 'pdflatex'... 
Rule 'pdflatex': Reasons for rerun Changed files or newly in use/created: mpi4py.aux mpi4py.ind mpi4py.out mpi4py.toc ------------ Run number 3 of rule 'pdflatex' ------------ ------------ Running 'pdflatex -recorder "mpi4py.tex"' ------------ This is pdfTeX, Version 3.141592653-2.6-1.40.26 (TeX Live 2025/dev/Debian) (preloaded format=pdflatex) restricted \write18 enabled. entering extended mode (./mpi4py.tex LaTeX2e <2024-06-01> patch level 2 L3 programming layer <2024-08-16> (./sphinxhowto.cls Document Class: sphinxhowto 2019/12/01 v2.3.0 Document class (Sphinx howto) (/usr/share/texlive/texmf-dist/tex/latex/base/article.cls Document Class: article 2024/02/08 v1.4n Standard LaTeX document class (/usr/share/texlive/texmf-dist/tex/latex/base/size10.clo))) (/usr/share/texlive/texmf-dist/tex/latex/base/inputenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/cmap/cmap.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/fontenc.sty<>) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsmath.sty For additional information on amsmath, use the `?' option. 
(/usr/share/texlive/texmf-dist/tex/latex/amsmath/amstext.sty (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsgen.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsbsy.sty) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsopn.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amssymb.sty (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amsfonts.sty)) (/usr/share/texlive/texmf-dist/tex/generic/babel/babel.sty (/usr/share/texlive/texmf-dist/tex/generic/babel/txtbabel.def) (/usr/share/texlive/texmf-dist/tex/generic/babel-english/english.ldf)) (/usr/share/texlive/texmf-dist/tex/generic/babel/locale/en/babel-english.tex) (/usr/share/texmf/tex/latex/tex-gyre/tgtermes.sty (/usr/share/texlive/texmf-dist/tex/latex/kvoptions/kvoptions.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/keyval.sty) (/usr/share/texlive/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty) (/usr/share/texlive/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty))) (/usr/share/texmf/tex/latex/tex-gyre/tgheros.sty) (/usr/share/texlive/texmf-dist/tex/latex/fncychap/fncychap.sty) (./sphinx.sty (/usr/share/texlive/texmf-dist/tex/latex/xcolor/xcolor.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/color.cfg) (/usr/share/texlive/texmf-dist/tex/latex/graphics-def/pdftex.def) (/usr/share/texlive/texmf-dist/tex/latex/graphics/mathcolor.ltx)) (./sphinxoptionshyperref.sty) (./sphinxoptionsgeometry.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/textcomp.sty) (/usr/share/texlive/texmf-dist/tex/latex/float/float.sty) (/usr/share/texlive/texmf-dist/tex/latex/wrapfig/wrapfig.sty) (/usr/share/texlive/texmf-dist/tex/latex/capt-of/capt-of.sty) (/usr/share/texlive/texmf-dist/tex/latex/tools/multicol.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphicx.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphics.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/trig.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/graphics.cfg))) (./sphinxlatexgraphics.sty) 
(./sphinxpackageboxes.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.cfg) (/usr/share/texlive/texmf-dist/tex/latex/pict2e/p2e-pdftex.def)) (/usr/share/texlive/texmf-dist/tex/latex/ellipse/ellipse.sty)) (./sphinxlatexadmonitions.sty (/usr/share/texlive/texmf-dist/tex/latex/framed/framed.sty)) (./sphinxlatexliterals.sty (/usr/share/texlive/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/alltt.sty) (/usr/share/texlive/texmf-dist/tex/latex/upquote/upquote.sty) (/usr/share/texlive/texmf-dist/tex/latex/needspace/needspace.sty)) (./sphinxlatexshadowbox.sty) (./sphinxlatexcontainers.sty) (./sphinxhighlight.sty) (./sphinxlatextables.sty (/usr/share/texlive/texmf-dist/tex/latex/tabulary/tabulary.sty (/usr/share/texlive/texmf-dist/tex/latex/tools/array.sty)) (/usr/share/texlive/texmf-dist/tex/latex/tools/longtable.sty) (/usr/share/texlive/texmf-dist/tex/latex/varwidth/varwidth.sty) (/usr/share/texlive/texmf-dist/tex/latex/colortbl/colortbl.sty) (/usr/share/texlive/texmf-dist/tex/latex/booktabs/booktabs.sty)) (./sphinxlatexnumfig.sty) (./sphinxlatexlists.sty) (./sphinxpackagefootnote.sty ) (./sphinxlatexindbibtoc.sty (/usr/share/texlive/texmf-dist/tex/latex/base/makeidx.sty)) (./sphinxlatexstylepage.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip-2001-04-09.sty)) (/usr/share/texlive/texmf-dist/tex/latex/fancyhdr/fancyhdr.sty)) (./sphinxlatexstyleheadings.sty (/usr/share/texlive/texmf-dist/tex/latex/titlesec/titlesec.sty)) (./sphinxlatexstyletext.sty) (./sphinxlatexobjects.sty)) (/usr/share/texlive/texmf-dist/tex/latex/geometry/geometry.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/ifvtex.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/iftex.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hyperref.sty (/usr/share/texlive/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty) 
(/usr/share/texlive/texmf-dist/tex/generic/pdfescape/pdfescape.sty (/usr/share/texlive/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty (/usr/share/texlive/texmf-dist/tex/generic/infwarerr/infwarerr.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hycolor/hycolor.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/nameref.sty (/usr/share/texlive/texmf-dist/tex/latex/refcount/refcount.sty) (/usr/share/texlive/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty)) (/usr/share/texlive/texmf-dist/tex/latex/etoolbox/etoolbox.sty) (/usr/share/texlive/texmf-dist/tex/generic/stringenc/stringenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/pd1enc.def) (/usr/share/texlive/texmf-dist/tex/generic/intcalc/intcalc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/puenc.def) (/usr/share/texlive/texmf-dist/tex/latex/url/url.sty) (/usr/share/texlive/texmf-dist/tex/generic/bitset/bitset.sty (/usr/share/texlive/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty)) (/usr/share/texlive/texmf-dist/tex/latex/base/atbegshi-ltx.sty)) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hpdftex.def (/usr/share/texlive/texmf-dist/tex/latex/base/atveryend-ltx.sty) (/usr/share/texlive/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty (/usr/share/texlive/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hypcap/hypcap.sty (/usr/share/texlive/texmf-dist/tex/latex/letltxmacro/letltxmacro.sty)) (./sphinxmessages.sty) Writing index file mpi4py.idx (/usr/share/texmf/tex/latex/tex-gyre/t1qtm.fd) (/usr/share/texlive/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def) LaTeX Warning: Unused global option(s): [a4]. (./mpi4py.aux) (/usr/share/texlive/texmf-dist/tex/context/base/mkii/supp-pdf.mkii [Loading MPS to PDF converter (version 2006.09.02).] 
) (/usr/share/texlive/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty (/usr/share/texlive/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg)) *geometry* driver: auto-detecting *geometry* detected driver: pdftex (./mpi4py.out) (./mpi4py.out) (/usr/share/texmf/tex/latex/tex-gyre/t1qhv.fd)<><><><> (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsa.fd) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsb.fd) (./mpi4py.toc [1{/var/lib/texmf/fonts/map/pdftex/updmap/pdftex.map}{/usr/share/texmf/fonts/en c/dvips/tex-gyre/q-ec.enc}] [2]) [3] (/usr/share/texmf/tex/latex/tex-gyre/ts1qtm.fd) [4{/usr/share/texmf/fonts/enc/dvips/tex-gyre/q-ts1.enc}] [5] (/usr/share/texlive/texmf-dist/tex/latex/txfonts/t1txtt.fd) [6] [7] [8] [9] [10] [11] Underfull \hbox (badness 6691) in paragraph at lines 942--951 []\T1/qtm/m/it/10 MPI for Python \T1/qtm/m/n/10 uses the \T1/qtm/b/n/10 high-es t [][]\T1/qtm/m/n/10 pro-to-col ver-sion[][] avail-able in the Python run-time (see the [12] (/usr/share/texlive/texmf-dist/tex/latex/txfonts/ts1txtt.fd) [13] [14] [15] [16] [17] [18] [19] [20] [21] [22] [23] [24{/usr/share/texlive/texmf-dist/fonts/enc/dvips/base/8r.enc}] [25] [26] [27] Package tabulary Warning: No suitable columns! on input line 2462. Package tabulary Warning: No suitable columns! on input line 2531. Package tabulary Warning: No suitable columns! on input line 2551. [28] Package tabulary Warning: No suitable columns! on input line 2571. Package tabulary Warning: No suitable columns! on input line 2598. Package tabulary Warning: No suitable columns! on input line 2625. Package tabulary Warning: No suitable columns! on input line 2655. Underfull \hbox (badness 10000) in paragraph at lines 2717--2717 []|\T1/qtm/m/n/10 In-di-cate whether this thread called [][]\T1/txtt/m/sl/10 In it[][] \T1/qtm/m/n/10 or Package tabulary Warning: No suitable columns! on input line 2717. 
Underfull \hbox (badness 10000) in paragraph at lines 2717--2717 []|\T1/qtm/m/n/10 In-di-cate whether this thread called [][]\T1/txtt/m/sl/10 In it[][] \T1/qtm/m/n/10 or [29] Package tabulary Warning: No suitable columns! on input line 2744. Package tabulary Warning: No suitable columns! on input line 2778. Package tabulary Warning: No suitable columns! on input line 2805. Package tabulary Warning: No suitable columns! on input line 2874. Package tabulary Warning: No suitable columns! on input line 2922. [30] Package tabulary Warning: No suitable columns! on input line 2991. Package tabulary Warning: No suitable columns! on input line 3011. Underfull \hbox (badness 10000) in paragraph at lines 3098--3101 []|\T1/qtm/m/n/10 Con-stant \T1/txtt/m/n/10 BUFFER_AUTOMATIC \T1/qtm/m/n/10 of type [31] [32] [33] [34] [35] [36] Package tabulary Warning: No suitable columns! on input line 5137. [37] [38] Underfull \hbox (badness 10000) in paragraph at lines 5445--5447 [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T 1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/ n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]]] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl/10 Suppor tsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsDLPack[][] Underfull \hbox (badness 6188) in paragraph at lines 5445--5447 \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/t xtt/m/sl/10 Datatype[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str[][]\T1/qtm/m/ n/10 ] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl/10 Supp ortsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsDLPack[][] \T1/qtm /m/n/10 | [][]\T1/txtt/m/sl/10 SupportsCAI[][]\T1/qtm/m/n/10 , Underfull \hbox (badness 6268) in paragraph at lines 5445--5447 [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T 1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 
Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/ n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]], [][]\T1/txtt/m/sl/10 Datatype[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str [][]\T1/qtm/m/n/10 ] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txt t/m/sl/10 SupportsBuffer[][] \T1/qtm/m/n/10 | Underfull \hbox (badness 8151) in paragraph at lines 5445--5447 [][]\T1/txtt/m/sl/10 SupportsDLPack[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 S upportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [ [][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/n/10 Sequence[ ][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/tx tt/m/sl/10 Datatype[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str[][]\T1/qtm/m/n /10 ] Underfull \hbox (badness 10000) in paragraph at lines 5445--5447 \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl /10 BottomType[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 None[][]\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][] \T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/ m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/ 10 ]], [][]\T1/txtt/m/sl/10 Datatype[][]\T1/qtm/m/n/10 ] | Underfull \hbox (badness 10000) in paragraph at lines 5482--5484 []\T1/qtm/m/n/10 alias of [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/ txtt/m/sl/10 SupportsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsD LPack[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl/10 Datatype[ ][]\T1/qtm/m/n/10 ]] Underfull \hbox (badness 10000) in paragraph at lines 5482--5484 \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl /10 SupportsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsDLPack[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 
SupportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/ txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/ n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], Underfull \hbox (badness 10000) in paragraph at lines 5482--5484 [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][ ]\T1/qtm/m/n/10 ]], [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txt t/m/sl/10 Datatype[][]\T1/qtm/m/n/10 ]] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/ m/n/10 [[][]\T1/txtt/m/sl/10 SupportsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m /sl/10 SupportsDLPack[][] \T1/qtm/m/n/10 | [39] [40] [41] [42] [43] Overfull \vbox (0.54955pt too high) detected at line 6049 [44] [45] [46] [47] [48] [49] [50] [51] [52] [53] [54] [55] [56] [57] [58] [59] [60] [61] [62] [63] [64] [65] [66] [67] [68] [69] [70] Package tabulary Warning: No suitable columns! on input line 9423. [71] Package tabulary Warning: No suitable columns! on input line 9626. [72] Underfull \hbox (badness 5533) in paragraph at lines 9787--9787 []|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with Package tabulary Warning: No suitable columns! on input line 9787. Underfull \hbox (badness 5533) in paragraph at lines 9787--9787 []|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with Package tabulary Warning: No suitable columns! on input line 9842. [73] [74] [75] [76] [77] Package tabulary Warning: No suitable columns! on input line 11183. 
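[Editor's note, not part of the build log.] The recurring "Package tabulary Warning: No suitable columns!" messages come from the tabulary package, which balances column widths only for its auto-sized column types (L, C, R, J); a table declared with none of those falls back to fixed widths and emits this warning. A minimal, hypothetical snippet (not taken from this build) that reproduces it:

```latex
% Hypothetical minimal example reproducing
% "Package tabulary Warning: No suitable columns!":
% the preamble "ll" contains only fixed column types, so tabulary
% has no L/C/R/J column to balance and warns, then falls back
% to ordinary fixed-width layout.
\documentclass{article}
\usepackage{tabulary}
\begin{document}
\begin{tabulary}{\linewidth}{ll}
a & b \\
\end{tabulary}
\end{document}
```

In Sphinx-generated LaTeX such as this manual, the warning is typically harmless: the affected tables are still typeset, just without tabulary's width balancing.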
[78] [79] [80] [81] [82] [83] [84] [85] [86] [87] [88] [89] [90] [91] [92] [93] [94] [95] [96] [97] [98] [99] [100] [101] [102] [103] [104] [105] Underfull \hbox (badness 10000) in paragraph at lines 15543--15545 []|[][]\T1/txtt/m/sl/10 Create_hindexed_block[][]\T1/qtm/m/n/10 (blocklength, d is-place- Underfull \hbox (badness 10000) in paragraph at lines 15564--15566 []|[][]\T1/txtt/m/sl/10 Create_indexed_block[][]\T1/qtm/m/n/10 (blocklength, di s-place- Underfull \hbox (badness 5203) in paragraph at lines 15742--15745 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to Underfull \hbox (badness 5203) in paragraph at lines 15749--15752 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to [106] [107] Package tabulary Warning: No suitable columns! on input line 15928. [108] [109] [110] [111] [112] [113] [114] [115] [116] Package tabulary Warning: No suitable columns! on input line 17362. [117] Package tabulary Warning: No suitable columns! on input line 17486. Package tabulary Warning: No suitable columns! on input line 17506. [118] [119] [120] Package tabulary Warning: No suitable columns! on input line 18195. [121] [122] [123] [124] [125] [126] [127] [128] [129] [130] [131] Package tabulary Warning: No suitable columns! on input line 19958. Package tabulary Warning: No suitable columns! on input line 20027. [132] [133] Package tabulary Warning: No suitable columns! on input line 20278. [134] Package tabulary Warning: No suitable columns! on input line 20538. Package tabulary Warning: No suitable columns! on input line 20572. [135] [136] [137] [138] Package tabulary Warning: No suitable columns! on input line 21252. Package tabulary Warning: No suitable columns! on input line 21272. [139] [140] [141] [142] Package tabulary Warning: No suitable columns! on input line 21828. Package tabulary Warning: No suitable columns! on input line 21855. 
[143] Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|[][]\T1/txtt/m/sl/10 Create_dist_graph[][]\T1/qtm/m/n/10 (sources, de-grees, des-ti-na- Package tabulary Warning: No suitable columns! on input line 22187. Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|[][]\T1/txtt/m/sl/10 Create_dist_graph[][]\T1/qtm/m/n/10 (sources, de-grees, des-ti-na- [144] [145] [146] [147] [148] [149] Package tabulary Warning: No suitable columns! on input line 23061. Package tabulary Warning: No suitable columns! on input line 23081. [150] [151] [152] Package tabulary Warning: No suitable columns! on input line 23516. Package tabulary Warning: No suitable columns! on input line 23550. [153] [154] Package tabulary Warning: No suitable columns! on input line 23836. Package tabulary Warning: No suitable columns! on input line 23863. [155] Package tabulary Warning: No suitable columns! on input line 24078. [156] [157] [158] Package tabulary Warning: No suitable columns! on input line 24524. [159] [160] [161] [162] [163] Package tabulary Warning: No suitable columns! on input line 25505. [164] Package tabulary Warning: No suitable columns! on input line 25525. [165] [166] [167] Package tabulary Warning: No suitable columns! on input line 26076. Package tabulary Warning: No suitable columns! on input line 26124. 
[168] [169] [170] Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgather_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgatherv_init[][]\T1/qtm/m/n/10 (sendbuf, re cvbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoall_init[][]\T1/qtm/m/n/10 (sendbuf, recv buf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallv_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallw_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Package tabulary Warning: No suitable columns! on input line 26636. Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgather_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgatherv_init[][]\T1/qtm/m/n/10 (sendbuf, re cvbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoall_init[][]\T1/qtm/m/n/10 (sendbuf, recv buf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallv_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallw_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Package tabulary Warning: No suitable columns! on input line 26691. [171] [172] [173] [174] [175] [176] Package tabulary Warning: No suitable columns! on input line 27777. [177] [178] [179] [180] [181] [182] [183] [184] [185] Package tabulary Warning: No suitable columns! on input line 29311. [186] Package tabulary Warning: No suitable columns! on input line 29366. 
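[Editor's note, not part of the build log.] An "Underfull \hbox (badness 10000)" warning means TeX had to stretch a line's interword glue past its badness limit; here the culprits are long monospaced identifiers such as Neighbor_allgatherv_init inside narrow table cells, which offer no hyphenation points. A minimal, hypothetical example (not taken from this build) that triggers the same class of warning:

```latex
% Hypothetical minimal example reproducing an underfull \hbox:
% a box forced to 10cm with only one short word of content, so the
% glue must stretch far beyond its natural amount (badness 10000).
\documentclass{article}
\begin{document}
\hbox to 10cm{short}
\end{document}
```

These warnings are cosmetic; they do not abort the run, and the build still produces the finished PDF.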
[187] [188] Package tabulary Warning: No suitable columns! on input line 29658. Package tabulary Warning: No suitable columns! on input line 29728. Package tabulary Warning: No suitable columns! on input line 29762. [189] Underfull \hbox (badness 10000) in paragraph at lines 30068--30071 []|\T1/qtm/m/n/10 In-di-cate whether this thread called [][]\T1/txtt/m/sl/10 In it[][] \T1/qtm/m/n/10 or [190] [191] [192] [193] [194] [195] [196] [197] [198] Underfull \hbox (badness 10000) in paragraph at lines 32195--32198 []|\T1/qtm/m/n/10 Con-stant \T1/txtt/m/n/10 BUFFER_AUTOMATIC \T1/qtm/m/n/10 of type [199] [200] [201] [202] [203] [204] [205] [206] [207] [208] [209] [210] [211] [212] [213] [214] [215] [216] [217] [218] [219] [220] [221] [222] [223] [224] [225] [226] [227] [228] [229] [230] [231] [232] [233] [234] [235] [236] [237] [238] [239] [240] [241] [242] [243] [244] [245] [246] [247] [248] [249] [250] [251] [252] Underfull \hbox (badness 10000) in paragraph at lines 39629--39632 []\T1/qtm/m/n/10 Add meth-ods \T1/txtt/m/n/10 Comm.Create_errhandler()\T1/qtm/m /n/10 , \T1/txtt/m/n/10 Win.Create_errhandler()\T1/qtm/m/n/10 , and \T1/txtt/m/ n/10 File. [253] [254] [255] [256] [257] Underfull \hbox (badness 10000) in paragraph at lines 40229--40234 \T1/txtt/m/n/10 port_name, info=INFO_NULL) \T1/qtm/m/n/10 and \T1/txtt/m/n/10 U npublish_name(service_name, port_name, Underfull \hbox (badness 10000) in paragraph at lines 40241--40245 []\T1/qtm/m/n/10 Change sig-na-ture of \T1/txtt/m/n/10 Win.Lock()\T1/qtm/m/n/10 . The new sig-na-ture is \T1/txtt/m/n/10 Win.Lock(rank, [258] [259] [260] [261] [262] [263] (./mpi4py.ind Underfull \hbox (badness 10000) in paragraph at lines 24--26 []\T1/txtt/m/n/10 __new__() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.BufferAu tomaticType static [264] Underfull \hbox (badness 10000) in paragraph at lines 160--161 []\T1/txtt/m/n/10 Call_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. 
Session method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 203--204 []\T1/txtt/m/n/10 COMM_TYPE_RESOURCE_GUIDED \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [265] Underfull \hbox (badness 10000) in paragraph at lines 243--244 []\T1/txtt/m/n/10 Create_contiguous() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Datatype Underfull \hbox (badness 10000) in paragraph at lines 245--246 []\T1/txtt/m/n/10 Create_dist_graph() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Intracomm Underfull \hbox (badness 10000) in paragraph at lines 246--248 []\T1/txtt/m/n/10 Create_dist_graph_adjacent() Underfull \hbox (badness 10000) in paragraph at lines 250--251 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Comm class Underfull \hbox (badness 10000) in paragraph at lines 251--252 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.File class Underfull \hbox (badness 10000) in paragraph at lines 252--253 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Session class Underfull \hbox (badness 10000) in paragraph at lines 253--254 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Win class Underfull \hbox (badness 10000) in paragraph at lines 258--259 []\T1/txtt/m/n/10 Create_f90_real() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. 
Datatype class Underfull \hbox (badness 10000) in paragraph at lines 263--265 []\T1/txtt/m/n/10 Create_from_session_pset() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mp i4py.MPI.Group Underfull \hbox (badness 10000) in paragraph at lines 269--270 []\T1/txtt/m/n/10 Create_hindexed_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4p y.MPI.Datatype Underfull \hbox (badness 10000) in paragraph at lines 270--271 []\T1/txtt/m/n/10 Create_hvector() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 271--272 []\T1/txtt/m/n/10 Create_indexed() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 272--273 []\T1/txtt/m/n/10 Create_indexed_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Datatype Underfull \hbox (badness 10000) in paragraph at lines 273--274 []\T1/txtt/m/n/10 Create_intercomm() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI .Intracomm Underfull \hbox (badness 10000) in paragraph at lines 275--276 []\T1/txtt/m/n/10 Create_keyval() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Da tatype class Underfull \hbox (badness 10000) in paragraph at lines 277--278 []\T1/txtt/m/n/10 Create_resized() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 278--279 []\T1/txtt/m/n/10 Create_struct() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Da tatype class [266] [267] Underfull \hbox (badness 10000) in paragraph at lines 420--421 []\T1/txtt/m/n/10 ERR_UNSUPPORTED_OPERATION \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [268] Underfull \hbox (badness 5161) in paragraph at lines 524--525 []\T1/txtt/m/n/10 fromhandle() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Messa ge class method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 560--562 []\T1/txtt/m/n/10 Get_dist_neighbors_count() Underfull \hbox (badness 6725) in paragraph at lines 571--572 
[]\T1/txtt/m/n/10 Get_error_code() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.E xception method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 590--591 []\T1/txtt/m/n/10 Get_neighbors_count() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Graphcomm Underfull \hbox (badness 10000) in paragraph at lines 602--603 []\T1/txtt/m/n/10 Get_remote_group() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI .Intercomm [269] Underfull \hbox (badness 10000) in paragraph at lines 612--613 []\T1/txtt/m/n/10 Get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 613--614 []\T1/txtt/m/n/10 get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 614--616 []\T1/txtt/m/n/10 get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.util. pkl5.Request class Underfull \hbox (badness 10000) in paragraph at lines 616--617 []\T1/txtt/m/n/10 Get_status_any() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 617--618 []\T1/txtt/m/n/10 get_status_any() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 618--619 []\T1/txtt/m/n/10 Get_status_some() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Request class Underfull \hbox (badness 10000) in paragraph at lines 619--620 []\T1/txtt/m/n/10 get_status_some() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Request class Underfull \hbox (badness 10000) in paragraph at lines 626--627 []\T1/txtt/m/n/10 Get_value_index() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Datatype class Underfull \hbox (badness 10000) in paragraph at lines 692--693 []\T1/txtt/m/n/10 Ineighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 693--694 []\T1/txtt/m/n/10 Ineighbor_allgatherv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 694--695 []\T1/txtt/m/n/10 Ineighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 695--696 []\T1/txtt/m/n/10 Ineighbor_alltoallv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 696--697 []\T1/txtt/m/n/10 Ineighbor_alltoallw() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Topocomm [270] Underfull \hbox (badness 10000) in paragraph at lines 747--748 []\T1/txtt/m/n/10 Ireduce_scatter_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4p y.MPI.Comm Underfull \hbox (badness 10000) in paragraph at lines 777--778 []\T1/txtt/m/n/10 istarmap_unordered() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.u til.pool.Pool [271] Underfull \hbox (badness 10000) in paragraph at lines 835--836 []\T1/txtt/m/n/10 MAX_LIBRARY_VERSION_STRING \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [272] Underfull \hbox (badness 10000) in paragraph at lines 936--937 []\T1/txtt/m/n/10 Neighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 937--938 []\T1/txtt/m/n/10 neighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 938--940 []\T1/txtt/m/n/10 Neighbor_allgather_init() Underfull \hbox (badness 10000) in paragraph at lines 940--941 []\T1/txtt/m/n/10 Neighbor_allgatherv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 941--943 []\T1/txtt/m/n/10 Neighbor_allgatherv_init() Underfull \hbox (badness 10000) in paragraph at lines 943--944 []\T1/txtt/m/n/10 Neighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 944--945 []\T1/txtt/m/n/10 neighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 946--947 []\T1/txtt/m/n/10 Neighbor_alltoallv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 947--949 []\T1/txtt/m/n/10 Neighbor_alltoallv_init() Underfull \hbox (badness 10000) in paragraph at lines 949--950 []\T1/txtt/m/n/10 Neighbor_alltoallw() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 950--952 []\T1/txtt/m/n/10 Neighbor_alltoallw_init() Underfull \hbox (badness 10000) in paragraph at lines 959--960 []\T1/txtt/m/n/10 num_workers \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.futures.MP IPoolExecutor at- Underfull \hbox (badness 10000) in paragraph at lines 980--981 []\T1/txtt/m/n/10 Pack_external_size() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Datatype [273] Underfull \hbox (badness 10000) in paragraph at lines 1046--1047 []\T1/txtt/m/n/10 Read_ordered_begin() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.File method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 1069--1070 []\T1/txtt/m/n/10 Reduce_scatter_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Comm Underfull \hbox (badness 10000) in paragraph at lines 1071--1072 []\T1/txtt/m/n/10 Reduce_scatter_init() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Comm [274] [275] Underfull \hbox (badness 10000) in paragraph at lines 1311--1312 []\T1/txtt/m/n/10 Write_at_all_begin() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.File method\T1/qtm/m/n/10 ), [276]) (./mpi4py.aux) LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right. ) (see the transcript file for additional information) Output written on mpi4py.pdf (276 pages, 1021955 bytes). Transcript written on mpi4py.log. Latexmk: Getting log file 'mpi4py.log' Latexmk: Examining 'mpi4py.fls' Latexmk: Examining 'mpi4py.log' Latexmk: Index file 'mpi4py.idx' was written Latexmk: References changed. Latexmk: Log file says output to 'mpi4py.pdf' Have index file 'mpi4py.idx', mpi4py.ind mpi4py Latexmk: applying rule 'makeindex mpi4py.idx'... Rule 'makeindex mpi4py.idx': Reasons for rerun Changed files or newly in use/created: mpi4py.idx ------------ Run number 3 of rule 'makeindex mpi4py.idx' ------------ ------------ Running 'makeindex -s python.ist -o "mpi4py.ind" "mpi4py.idx"' ------------ This is makeindex, version 2.17 [TeX Live 2025/dev] (kpathsea + Thai support). Scanning style file ./python.ist.......done (7 attributes redefined, 0 ignored). Scanning input file mpi4py.idx.....done (1251 entries accepted, 0 rejected). Sorting entries.............done (13596 comparisons). Generating output file mpi4py.ind.....done (1321 lines written, 0 warnings). Output written in mpi4py.ind. Transcript written in mpi4py.ilg. Latexmk: applying rule 'pdflatex'... Rule 'pdflatex': Reasons for rerun Changed files or newly in use/created: mpi4py.aux mpi4py.ind mpi4py.toc ------------ Run number 4 of rule 'pdflatex' ------------ ------------ Running 'pdflatex -recorder "mpi4py.tex"' ------------ This is pdfTeX, Version 3.141592653-2.6-1.40.26 (TeX Live 2025/dev/Debian) (preloaded format=pdflatex) restricted \write18 enabled. 
entering extended mode (./mpi4py.tex LaTeX2e <2024-06-01> patch level 2 L3 programming layer <2024-08-16> (./sphinxhowto.cls Document Class: sphinxhowto 2019/12/01 v2.3.0 Document class (Sphinx howto) (/usr/share/texlive/texmf-dist/tex/latex/base/article.cls Document Class: article 2024/02/08 v1.4n Standard LaTeX document class (/usr/share/texlive/texmf-dist/tex/latex/base/size10.clo))) (/usr/share/texlive/texmf-dist/tex/latex/base/inputenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/cmap/cmap.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/fontenc.sty<>) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsmath.sty For additional information on amsmath, use the `?' option. (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amstext.sty (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsgen.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsbsy.sty) (/usr/share/texlive/texmf-dist/tex/latex/amsmath/amsopn.sty)) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amssymb.sty (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/amsfonts.sty)) (/usr/share/texlive/texmf-dist/tex/generic/babel/babel.sty (/usr/share/texlive/texmf-dist/tex/generic/babel/txtbabel.def) (/usr/share/texlive/texmf-dist/tex/generic/babel-english/english.ldf)) (/usr/share/texlive/texmf-dist/tex/generic/babel/locale/en/babel-english.tex) (/usr/share/texmf/tex/latex/tex-gyre/tgtermes.sty (/usr/share/texlive/texmf-dist/tex/latex/kvoptions/kvoptions.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/keyval.sty) (/usr/share/texlive/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty) (/usr/share/texlive/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty))) (/usr/share/texmf/tex/latex/tex-gyre/tgheros.sty) (/usr/share/texlive/texmf-dist/tex/latex/fncychap/fncychap.sty) (./sphinx.sty (/usr/share/texlive/texmf-dist/tex/latex/xcolor/xcolor.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/color.cfg) (/usr/share/texlive/texmf-dist/tex/latex/graphics-def/pdftex.def) 
(/usr/share/texlive/texmf-dist/tex/latex/graphics/mathcolor.ltx)) (./sphinxoptionshyperref.sty) (./sphinxoptionsgeometry.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/textcomp.sty) (/usr/share/texlive/texmf-dist/tex/latex/float/float.sty) (/usr/share/texlive/texmf-dist/tex/latex/wrapfig/wrapfig.sty) (/usr/share/texlive/texmf-dist/tex/latex/capt-of/capt-of.sty) (/usr/share/texlive/texmf-dist/tex/latex/tools/multicol.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphicx.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/graphics.sty (/usr/share/texlive/texmf-dist/tex/latex/graphics/trig.sty) (/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/graphics.cfg))) (./sphinxlatexgraphics.sty) (./sphinxpackageboxes.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.sty (/usr/share/texlive/texmf-dist/tex/latex/pict2e/pict2e.cfg) (/usr/share/texlive/texmf-dist/tex/latex/pict2e/p2e-pdftex.def)) (/usr/share/texlive/texmf-dist/tex/latex/ellipse/ellipse.sty)) (./sphinxlatexadmonitions.sty (/usr/share/texlive/texmf-dist/tex/latex/framed/framed.sty)) (./sphinxlatexliterals.sty (/usr/share/texlive/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty) (/usr/share/texlive/texmf-dist/tex/latex/base/alltt.sty) (/usr/share/texlive/texmf-dist/tex/latex/upquote/upquote.sty) (/usr/share/texlive/texmf-dist/tex/latex/needspace/needspace.sty)) (./sphinxlatexshadowbox.sty) (./sphinxlatexcontainers.sty) (./sphinxhighlight.sty) (./sphinxlatextables.sty (/usr/share/texlive/texmf-dist/tex/latex/tabulary/tabulary.sty (/usr/share/texlive/texmf-dist/tex/latex/tools/array.sty)) (/usr/share/texlive/texmf-dist/tex/latex/tools/longtable.sty) (/usr/share/texlive/texmf-dist/tex/latex/varwidth/varwidth.sty) (/usr/share/texlive/texmf-dist/tex/latex/colortbl/colortbl.sty) (/usr/share/texlive/texmf-dist/tex/latex/booktabs/booktabs.sty)) (./sphinxlatexnumfig.sty) (./sphinxlatexlists.sty) (./sphinxpackagefootnote.sty ) (./sphinxlatexindbibtoc.sty 
(/usr/share/texlive/texmf-dist/tex/latex/base/makeidx.sty)) (./sphinxlatexstylepage.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip.sty (/usr/share/texlive/texmf-dist/tex/latex/parskip/parskip-2001-04-09.sty)) (/usr/share/texlive/texmf-dist/tex/latex/fancyhdr/fancyhdr.sty)) (./sphinxlatexstyleheadings.sty (/usr/share/texlive/texmf-dist/tex/latex/titlesec/titlesec.sty)) (./sphinxlatexstyletext.sty) (./sphinxlatexobjects.sty)) (/usr/share/texlive/texmf-dist/tex/latex/geometry/geometry.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/ifvtex.sty (/usr/share/texlive/texmf-dist/tex/generic/iftex/iftex.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hyperref.sty (/usr/share/texlive/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty) (/usr/share/texlive/texmf-dist/tex/generic/pdfescape/pdfescape.sty (/usr/share/texlive/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty (/usr/share/texlive/texmf-dist/tex/generic/infwarerr/infwarerr.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hycolor/hycolor.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/nameref.sty (/usr/share/texlive/texmf-dist/tex/latex/refcount/refcount.sty) (/usr/share/texlive/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty)) (/usr/share/texlive/texmf-dist/tex/latex/etoolbox/etoolbox.sty) (/usr/share/texlive/texmf-dist/tex/generic/stringenc/stringenc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/pd1enc.def) (/usr/share/texlive/texmf-dist/tex/generic/intcalc/intcalc.sty) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/puenc.def) (/usr/share/texlive/texmf-dist/tex/latex/url/url.sty) (/usr/share/texlive/texmf-dist/tex/generic/bitset/bitset.sty (/usr/share/texlive/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty)) (/usr/share/texlive/texmf-dist/tex/latex/base/atbegshi-ltx.sty)) (/usr/share/texlive/texmf-dist/tex/latex/hyperref/hpdftex.def (/usr/share/texlive/texmf-dist/tex/latex/base/atveryend-ltx.sty) 
(/usr/share/texlive/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty (/usr/share/texlive/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty))) (/usr/share/texlive/texmf-dist/tex/latex/hypcap/hypcap.sty (/usr/share/texlive/texmf-dist/tex/latex/letltxmacro/letltxmacro.sty)) (./sphinxmessages.sty) Writing index file mpi4py.idx (/usr/share/texmf/tex/latex/tex-gyre/t1qtm.fd) (/usr/share/texlive/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def) LaTeX Warning: Unused global option(s): [a4]. (./mpi4py.aux) (/usr/share/texlive/texmf-dist/tex/context/base/mkii/supp-pdf.mkii [Loading MPS to PDF converter (version 2006.09.02).] ) (/usr/share/texlive/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty (/usr/share/texlive/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg)) *geometry* driver: auto-detecting *geometry* detected driver: pdftex (./mpi4py.out) (./mpi4py.out) (/usr/share/texmf/tex/latex/tex-gyre/t1qhv.fd)<><><><> (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsa.fd) (/usr/share/texlive/texmf-dist/tex/latex/amsfonts/umsb.fd) (./mpi4py.toc [1{/var/lib/texmf/fonts/map/pdftex/updmap/pdftex.map}{/usr/share/texmf/fonts/en c/dvips/tex-gyre/q-ec.enc}] [2]) [3] (/usr/share/texmf/tex/latex/tex-gyre/ts1qtm.fd) [4{/usr/share/texmf/fonts/enc/dvips/tex-gyre/q-ts1.enc}] [5] (/usr/share/texlive/texmf-dist/tex/latex/txfonts/t1txtt.fd) [6] [7] [8] [9] [10] [11] Underfull \hbox (badness 6691) in paragraph at lines 942--951 []\T1/qtm/m/it/10 MPI for Python \T1/qtm/m/n/10 uses the \T1/qtm/b/n/10 high-es t [][]\T1/qtm/m/n/10 pro-to-col ver-sion[][] avail-able in the Python run-time (see the [12] (/usr/share/texlive/texmf-dist/tex/latex/txfonts/ts1txtt.fd) [13] [14] [15] [16] [17] [18] [19] [20] [21] [22] [23] [24{/usr/share/texlive/texmf-dist/fonts/enc/dvips/base/8r.enc}] [25] [26] [27] Package tabulary Warning: No suitable columns! on input line 2462. Package tabulary Warning: No suitable columns! on input line 2531. Package tabulary Warning: No suitable columns! 
on input line 2551. [28] Package tabulary Warning: No suitable columns! on input line 2571. Package tabulary Warning: No suitable columns! on input line 2598. Package tabulary Warning: No suitable columns! on input line 2625. Package tabulary Warning: No suitable columns! on input line 2655. Underfull \hbox (badness 10000) in paragraph at lines 2717--2717 []|\T1/qtm/m/n/10 In-di-cate whether this thread called [][]\T1/txtt/m/sl/10 In it[][] \T1/qtm/m/n/10 or Package tabulary Warning: No suitable columns! on input line 2717. Underfull \hbox (badness 10000) in paragraph at lines 2717--2717 []|\T1/qtm/m/n/10 In-di-cate whether this thread called [][]\T1/txtt/m/sl/10 In it[][] \T1/qtm/m/n/10 or [29] Package tabulary Warning: No suitable columns! on input line 2744. Package tabulary Warning: No suitable columns! on input line 2778. Package tabulary Warning: No suitable columns! on input line 2805. Package tabulary Warning: No suitable columns! on input line 2874. Package tabulary Warning: No suitable columns! on input line 2922. [30] Package tabulary Warning: No suitable columns! on input line 2991. Package tabulary Warning: No suitable columns! on input line 3011. Underfull \hbox (badness 10000) in paragraph at lines 3098--3101 []|\T1/qtm/m/n/10 Con-stant \T1/txtt/m/n/10 BUFFER_AUTOMATIC \T1/qtm/m/n/10 of type [31] [32] [33] [34] [35] [36] Package tabulary Warning: No suitable columns! on input line 5137. 
[37] [38] Underfull \hbox (badness 10000) in paragraph at lines 5445--5447 [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T 1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/ n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]]] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl/10 Suppor tsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsDLPack[][] Underfull \hbox (badness 6188) in paragraph at lines 5445--5447 \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/t xtt/m/sl/10 Datatype[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str[][]\T1/qtm/m/ n/10 ] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl/10 Supp ortsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsDLPack[][] \T1/qtm /m/n/10 | [][]\T1/txtt/m/sl/10 SupportsCAI[][]\T1/qtm/m/n/10 , Underfull \hbox (badness 6268) in paragraph at lines 5445--5447 [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T 1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/ n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ]], [][]\T1/txtt/m/sl/10 Datatype[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str [][]\T1/qtm/m/n/10 ] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txt t/m/sl/10 SupportsBuffer[][] \T1/qtm/m/n/10 | Underfull \hbox (badness 8151) in paragraph at lines 5445--5447 [][]\T1/txtt/m/sl/10 SupportsDLPack[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 S upportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [ [][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/m/n/10 Sequence[ ][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/tx tt/m/sl/10 Datatype[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 str[][]\T1/qtm/m/n /10 ] Underfull \hbox (badness 10000) in paragraph at lines 5445--5447 \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 
Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl /10 BottomType[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 None[][]\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][] \T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], [][]\T1/txtt/ m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/ 10 ]], [][]\T1/txtt/m/sl/10 Datatype[][]\T1/qtm/m/n/10 ] | Underfull \hbox (badness 10000) in paragraph at lines 5482--5484 []\T1/qtm/m/n/10 alias of [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/ txtt/m/sl/10 SupportsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsD LPack[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl/10 Datatype[ ][]\T1/qtm/m/n/10 ]] Underfull \hbox (badness 10000) in paragraph at lines 5482--5484 \T1/qtm/m/n/10 | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/sl /10 SupportsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsDLPack[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m/sl/10 SupportsCAI[][]\T1/qtm/m/n/10 , [][]\T1/ txtt/m/n/10 Tuple[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/ n/10 [[][]\T1/txtt/m/n/10 Integral[][]\T1/qtm/m/n/10 ], Underfull \hbox (badness 10000) in paragraph at lines 5482--5484 [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txtt/m/n/10 Integral[][ ]\T1/qtm/m/n/10 ]], [][]\T1/txtt/m/n/10 Sequence[][]\T1/qtm/m/n/10 [[][]\T1/txt t/m/sl/10 Datatype[][]\T1/qtm/m/n/10 ]] | [][]\T1/txtt/m/n/10 Tuple[][]\T1/qtm/ m/n/10 [[][]\T1/txtt/m/sl/10 SupportsBuffer[][] \T1/qtm/m/n/10 | [][]\T1/txtt/m /sl/10 SupportsDLPack[][] \T1/qtm/m/n/10 | [39] [40] [41] [42] [43] Overfull \vbox (0.54955pt too high) detected at line 6049 [44] [45] [46] [47] [48] [49] [50] [51] [52] [53] [54] [55] [56] [57] [58] [59] [60] [61] [62] [63] [64] [65] [66] [67] [68] [69] [70] Package tabulary Warning: No suitable columns! on input line 9423. 
[71] Package tabulary Warning: No suitable columns! on input line 9626. [72] Underfull \hbox (badness 5533) in paragraph at lines 9787--9787 []|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with Package tabulary Warning: No suitable columns! on input line 9787. Underfull \hbox (badness 5533) in paragraph at lines 9787--9787 []|\T1/qtm/m/n/10 Re-turn a pro-cess ranks for data shift-ing with Package tabulary Warning: No suitable columns! on input line 9842. [73] [74] [75] [76] [77] Package tabulary Warning: No suitable columns! on input line 11183. [78] [79] [80] [81] [82] [83] [84] [85] [86] [87] [88] [89] [90] [91] [92] [93] [94] [95] [96] [97] [98] [99] [100] [101] [102] [103] [104] [105] Underfull \hbox (badness 10000) in paragraph at lines 15543--15545 []|[][]\T1/txtt/m/sl/10 Create_hindexed_block[][]\T1/qtm/m/n/10 (blocklength, d is-place- Underfull \hbox (badness 10000) in paragraph at lines 15564--15566 []|[][]\T1/txtt/m/sl/10 Create_indexed_block[][]\T1/qtm/m/n/10 (blocklength, di s-place- Underfull \hbox (badness 5203) in paragraph at lines 15742--15745 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to Underfull \hbox (badness 5203) in paragraph at lines 15749--15752 []|\T1/qtm/m/n/10 Un-pack from con-tigu-ous mem-ory ac-cord-ing to [106] [107] Package tabulary Warning: No suitable columns! on input line 15928. [108] [109] [110] [111] [112] [113] [114] [115] [116] Package tabulary Warning: No suitable columns! on input line 17362. [117] Package tabulary Warning: No suitable columns! on input line 17486. Package tabulary Warning: No suitable columns! on input line 17506. [118] [119] [120] Package tabulary Warning: No suitable columns! on input line 18195. [121] [122] [123] [124] [125] [126] [127] [128] [129] [130] [131] Package tabulary Warning: No suitable columns! on input line 19958. Package tabulary Warning: No suitable columns! on input line 20027. [132] [133] Package tabulary Warning: No suitable columns! 
on input line 20278. [134] Package tabulary Warning: No suitable columns! on input line 20538. Package tabulary Warning: No suitable columns! on input line 20572. [135] [136] [137] [138] Package tabulary Warning: No suitable columns! on input line 21252. Package tabulary Warning: No suitable columns! on input line 21272. [139] [140] [141] [142] Package tabulary Warning: No suitable columns! on input line 21828. Package tabulary Warning: No suitable columns! on input line 21855. [143] Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|[][]\T1/txtt/m/sl/10 Create_dist_graph[][]\T1/qtm/m/n/10 (sources, de-grees, des-ti-na- Package tabulary Warning: No suitable columns! on input line 22187. Underfull \hbox (badness 10000) in paragraph at lines 22187--22187 []|[][]\T1/txtt/m/sl/10 Create_dist_graph[][]\T1/qtm/m/n/10 (sources, de-grees, des-ti-na- [144] [145] [146] [147] [148] [149] Package tabulary Warning: No suitable columns! on input line 23061. Package tabulary Warning: No suitable columns! on input line 23081. [150] [151] [152] Package tabulary Warning: No suitable columns! on input line 23516. Package tabulary Warning: No suitable columns! on input line 23550. [153] [154] Package tabulary Warning: No suitable columns! on input line 23836. Package tabulary Warning: No suitable columns! on input line 23863. [155] Package tabulary Warning: No suitable columns! on input line 24078. [156] [157] [158] Package tabulary Warning: No suitable columns! on input line 24524. [159] [160] [161] [162] [163] Package tabulary Warning: No suitable columns! on input line 25505. [164] Package tabulary Warning: No suitable columns! on input line 25525. [165] [166] [167] Package tabulary Warning: No suitable columns! on input line 26076. Package tabulary Warning: No suitable columns! on input line 26124. 
[168] [169] [170] Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgather_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgatherv_init[][]\T1/qtm/m/n/10 (sendbuf, re cvbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoall_init[][]\T1/qtm/m/n/10 (sendbuf, recv buf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallv_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallw_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Package tabulary Warning: No suitable columns! on input line 26636. Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgather_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_allgatherv_init[][]\T1/qtm/m/n/10 (sendbuf, re cvbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoall_init[][]\T1/qtm/m/n/10 (sendbuf, recv buf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallv_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Underfull \hbox (badness 10000) in paragraph at lines 26636--26636 []|[][]\T1/txtt/m/sl/10 Neighbor_alltoallw_init[][]\T1/qtm/m/n/10 (sendbuf, rec vbuf[, Package tabulary Warning: No suitable columns! on input line 26691. [171] [172] [173] [174] [175] [176] Package tabulary Warning: No suitable columns! on input line 27777. [177] [178] [179] [180] [181] [182] [183] [184] [185] Package tabulary Warning: No suitable columns! on input line 29311. [186] Package tabulary Warning: No suitable columns! on input line 29366. 
[187] [188] Package tabulary Warning: No suitable columns! on input line 29658. Package tabulary Warning: No suitable columns! on input line 29728. Package tabulary Warning: No suitable columns! on input line 29762. [189] Underfull \hbox (badness 10000) in paragraph at lines 30068--30071 []|\T1/qtm/m/n/10 In-di-cate whether this thread called [][]\T1/txtt/m/sl/10 In it[][] \T1/qtm/m/n/10 or [190] [191] [192] [193] [194] [195] [196] [197] [198] Underfull \hbox (badness 10000) in paragraph at lines 32195--32198 []|\T1/qtm/m/n/10 Con-stant \T1/txtt/m/n/10 BUFFER_AUTOMATIC \T1/qtm/m/n/10 of type [199] [200] [201] [202] [203] [204] [205] [206] [207] [208] [209] [210] [211] [212] [213] [214] [215] [216] [217] [218] [219] [220] [221] [222] [223] [224] [225] [226] [227] [228] [229] [230] [231] [232] [233] [234] [235] [236] [237] [238] [239] [240] [241] [242] [243] [244] [245] [246] [247] [248] [249] [250] [251] [252] Underfull \hbox (badness 10000) in paragraph at lines 39629--39632 []\T1/qtm/m/n/10 Add meth-ods \T1/txtt/m/n/10 Comm.Create_errhandler()\T1/qtm/m /n/10 , \T1/txtt/m/n/10 Win.Create_errhandler()\T1/qtm/m/n/10 , and \T1/txtt/m/ n/10 File. [253] [254] [255] [256] [257] Underfull \hbox (badness 10000) in paragraph at lines 40229--40234 \T1/txtt/m/n/10 port_name, info=INFO_NULL) \T1/qtm/m/n/10 and \T1/txtt/m/n/10 U npublish_name(service_name, port_name, Underfull \hbox (badness 10000) in paragraph at lines 40241--40245 []\T1/qtm/m/n/10 Change sig-na-ture of \T1/txtt/m/n/10 Win.Lock()\T1/qtm/m/n/10 . The new sig-na-ture is \T1/txtt/m/n/10 Win.Lock(rank, [258] [259] [260] [261] [262] [263] (./mpi4py.ind Underfull \hbox (badness 10000) in paragraph at lines 24--26 []\T1/txtt/m/n/10 __new__() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.BufferAu tomaticType static [264] Underfull \hbox (badness 10000) in paragraph at lines 160--161 []\T1/txtt/m/n/10 Call_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. 
Session method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 203--204 []\T1/txtt/m/n/10 COMM_TYPE_RESOURCE_GUIDED \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [265] Underfull \hbox (badness 10000) in paragraph at lines 243--244 []\T1/txtt/m/n/10 Create_contiguous() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Datatype Underfull \hbox (badness 10000) in paragraph at lines 245--246 []\T1/txtt/m/n/10 Create_dist_graph() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Intracomm Underfull \hbox (badness 10000) in paragraph at lines 246--248 []\T1/txtt/m/n/10 Create_dist_graph_adjacent() Underfull \hbox (badness 10000) in paragraph at lines 250--251 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Comm class Underfull \hbox (badness 10000) in paragraph at lines 251--252 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.File class Underfull \hbox (badness 10000) in paragraph at lines 252--253 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Session class Underfull \hbox (badness 10000) in paragraph at lines 253--254 []\T1/txtt/m/n/10 Create_errhandler() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Win class Underfull \hbox (badness 10000) in paragraph at lines 258--259 []\T1/txtt/m/n/10 Create_f90_real() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. 
Datatype class Underfull \hbox (badness 10000) in paragraph at lines 263--265 []\T1/txtt/m/n/10 Create_from_session_pset() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mp i4py.MPI.Group Underfull \hbox (badness 10000) in paragraph at lines 269--270 []\T1/txtt/m/n/10 Create_hindexed_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4p y.MPI.Datatype Underfull \hbox (badness 10000) in paragraph at lines 270--271 []\T1/txtt/m/n/10 Create_hvector() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 271--272 []\T1/txtt/m/n/10 Create_indexed() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 272--273 []\T1/txtt/m/n/10 Create_indexed_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Datatype Underfull \hbox (badness 10000) in paragraph at lines 273--274 []\T1/txtt/m/n/10 Create_intercomm() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI .Intracomm Underfull \hbox (badness 10000) in paragraph at lines 275--276 []\T1/txtt/m/n/10 Create_keyval() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Da tatype class Underfull \hbox (badness 10000) in paragraph at lines 277--278 []\T1/txtt/m/n/10 Create_resized() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.D atatype method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 278--279 []\T1/txtt/m/n/10 Create_struct() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Da tatype class [266] [267] Underfull \hbox (badness 10000) in paragraph at lines 420--421 []\T1/txtt/m/n/10 ERR_UNSUPPORTED_OPERATION \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [268] Underfull \hbox (badness 5161) in paragraph at lines 524--525 []\T1/txtt/m/n/10 fromhandle() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.Messa ge class method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 560--562 []\T1/txtt/m/n/10 Get_dist_neighbors_count() Underfull \hbox (badness 6725) in paragraph at lines 571--572 
[]\T1/txtt/m/n/10 Get_error_code() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.E xception method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 590--591 []\T1/txtt/m/n/10 Get_neighbors_count() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Graphcomm Underfull \hbox (badness 10000) in paragraph at lines 602--603 []\T1/txtt/m/n/10 Get_remote_group() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI .Intercomm [269] Underfull \hbox (badness 10000) in paragraph at lines 612--613 []\T1/txtt/m/n/10 Get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 613--614 []\T1/txtt/m/n/10 get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 614--616 []\T1/txtt/m/n/10 get_status_all() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.util. pkl5.Request class Underfull \hbox (badness 10000) in paragraph at lines 616--617 []\T1/txtt/m/n/10 Get_status_any() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 617--618 []\T1/txtt/m/n/10 get_status_any() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI.R equest class Underfull \hbox (badness 10000) in paragraph at lines 618--619 []\T1/txtt/m/n/10 Get_status_some() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Request class Underfull \hbox (badness 10000) in paragraph at lines 619--620 []\T1/txtt/m/n/10 get_status_some() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Request class Underfull \hbox (badness 10000) in paragraph at lines 626--627 []\T1/txtt/m/n/10 Get_value_index() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MPI. Datatype class Underfull \hbox (badness 10000) in paragraph at lines 692--693 []\T1/txtt/m/n/10 Ineighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 693--694 []\T1/txtt/m/n/10 Ineighbor_allgatherv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 694--695 []\T1/txtt/m/n/10 Ineighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 695--696 []\T1/txtt/m/n/10 Ineighbor_alltoallv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 696--697 []\T1/txtt/m/n/10 Ineighbor_alltoallw() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. MPI.Topocomm [270] Underfull \hbox (badness 10000) in paragraph at lines 747--748 []\T1/txtt/m/n/10 Ireduce_scatter_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4p y.MPI.Comm Underfull \hbox (badness 10000) in paragraph at lines 777--778 []\T1/txtt/m/n/10 istarmap_unordered() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.u til.pool.Pool [271] Underfull \hbox (badness 10000) in paragraph at lines 835--836 []\T1/txtt/m/n/10 MAX_LIBRARY_VERSION_STRING \T1/qtm/m/n/10 (\T1/qtm/m/it/10 in mod-ule [272] Underfull \hbox (badness 10000) in paragraph at lines 936--937 []\T1/txtt/m/n/10 Neighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 937--938 []\T1/txtt/m/n/10 neighbor_allgather() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 938--940 []\T1/txtt/m/n/10 Neighbor_allgather_init() Underfull \hbox (badness 10000) in paragraph at lines 940--941 []\T1/txtt/m/n/10 Neighbor_allgatherv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 941--943 []\T1/txtt/m/n/10 Neighbor_allgatherv_init() Underfull \hbox (badness 10000) in paragraph at lines 943--944 []\T1/txtt/m/n/10 Neighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 944--945 []\T1/txtt/m/n/10 neighbor_alltoall() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.MP I.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 946--947 []\T1/txtt/m/n/10 Neighbor_alltoallv() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 947--949 []\T1/txtt/m/n/10 Neighbor_alltoallv_init() Underfull \hbox (badness 10000) in paragraph at lines 949--950 []\T1/txtt/m/n/10 Neighbor_alltoallw() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Topocomm Underfull \hbox (badness 10000) in paragraph at lines 950--952 []\T1/txtt/m/n/10 Neighbor_alltoallw_init() Underfull \hbox (badness 10000) in paragraph at lines 959--960 []\T1/txtt/m/n/10 num_workers \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.futures.MP IPoolExecutor at- Underfull \hbox (badness 10000) in paragraph at lines 980--981 []\T1/txtt/m/n/10 Pack_external_size() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.Datatype [273] Underfull \hbox (badness 10000) in paragraph at lines 1046--1047 []\T1/txtt/m/n/10 Read_ordered_begin() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.File method\T1/qtm/m/n/10 ), Underfull \hbox (badness 10000) in paragraph at lines 1069--1070 []\T1/txtt/m/n/10 Reduce_scatter_block() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py .MPI.Comm Underfull \hbox (badness 10000) in paragraph at lines 1071--1072 []\T1/txtt/m/n/10 Reduce_scatter_init() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py. 
MPI.Comm [274] [275] Underfull \hbox (badness 10000) in paragraph at lines 1311--1312 []\T1/txtt/m/n/10 Write_at_all_begin() \T1/qtm/m/n/10 (\T1/qtm/m/it/10 mpi4py.M PI.File method\T1/qtm/m/n/10 ), [276]) (./mpi4py.aux) ) (see the transcript file for additional information) Output written on mpi4py.pdf (276 pages, 1021928 bytes). Transcript written on mpi4py.log. Latexmk: Getting log file 'mpi4py.log' Latexmk: Examining 'mpi4py.fls' Latexmk: Examining 'mpi4py.log' Latexmk: Index file 'mpi4py.idx' was written Latexmk: Log file says output to 'mpi4py.pdf' Have index file 'mpi4py.idx', mpi4py.ind mpi4py Latexmk: All targets (mpi4py.pdf) are up-to-date make[3]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0/docs/source/_build/latex' make[2]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0/docs/source' make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' debian/rules override_dh_auto_test make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' set -e; \ if [ "yes" = "no" ]; then \ echo Build tests have been disabled; \ else \ for vv in 3.12; do \ echo "I: testing using python$vv"; \ echo "I: running tests with single process"; \ PYTHONPATH=`pybuild -p $vv --print "{build_dir}"` \ GITHUB_ACTIONS=true MPI4PY_TEST_SPAWN=false \ /usr/bin/python$vv /build/reproducible-path/mpi4py-4.0.0/test/runtests.py -v || /bin/false; \ nproc=`nproc`; MIN_PROC=$(( nproc > 2 ? nproc : 2 )); \ NUM_PROC=$(( MIN_PROC > 5 ? 
5 : MIN_PROC )); \ echo "I: running tests with MPI ($NUM_PROC processes)"; \ PYTHONPATH=`pybuild -p $vv --print "{build_dir}"` \ GITHUB_ACTIONS=true MPI4PY_TEST_SPAWN=false \ mpirun -v -n $NUM_PROC /usr/bin/python$vv /build/reproducible-path/mpi4py-4.0.0/test/runtests.py -fv || /bin/false; \ done; \ fi I: testing using python3.12 I: running tests with single process [0@virt32a] Python 3.12.6 (/usr/bin/python3.12) [0@virt32a] numpy 1.26.4 (/usr/lib/python3/dist-packages/numpy) [0@virt32a] MPI 4.1 (MPICH 4.2.0) [0@virt32a] mpi4py 4.0.0 (/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py) testAintAdd (test_address.TestAddress.testAintAdd) ... ok testAintDiff (test_address.TestAddress.testAintDiff) ... ok testBottom (test_address.TestAddress.testBottom) ... ok testGetAddress1 (test_address.TestAddress.testGetAddress1) ... ok testGetAddress2 (test_address.TestAddress.testGetAddress2) ... ok testNone (test_address.TestAddress.testNone) ... ok testAttr (test_attributes.TestCommAttrSelf.testAttr) ... ok testAttrCopyDelete (test_attributes.TestCommAttrSelf.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrSelf.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrSelf.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrSelf.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestCommAttrSelf.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrSelf.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrSelf.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrSelf.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrSelf.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestCommAttrWorld.testAttr) ... ok testAttrCopyDelete (test_attributes.TestCommAttrWorld.testAttrCopyDelete) ... 
ok testAttrCopyException (test_attributes.TestCommAttrWorld.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrWorld.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrWorld.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestCommAttrWorld.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrWorld.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrWorld.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrWorld.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrWorld.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrBYTE.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrBYTE.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrBYTE.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrBYTE.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrBYTE.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrBYTE.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrBYTE.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrBYTE.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrFLOAT.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyTrue) ... 
ok testAttrDeleteException (test_attributes.TestDatatypeAttrFLOAT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrFLOAT.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrINT.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrINT.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrINT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrINT.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrINT.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrINT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrINT.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrINT.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrINT.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrINT.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestWinAttr.testAttr) ... ok testAttrCopyDelete (test_attributes.TestWinAttr.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestWinAttr.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestWinAttr.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestWinAttr.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestWinAttr.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestWinAttr.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestWinAttr.testAttrNoPython) ... 
ok testAttrNoPythonArray (test_attributes.TestWinAttr.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestWinAttr.testAttrNoPythonZero) ... ok testAllocate (test_buffer.TestBuffer.testAllocate) ... ok testAttachBufferReadonly (test_buffer.TestBuffer.testAttachBufferReadonly) ... ok testBuffering (test_buffer.TestBuffer.testBuffering) ... ok testCast (test_buffer.TestBuffer.testCast) ... ok testFromAddress (test_buffer.TestBuffer.testFromAddress) ... ok testFromBufferArrayRO (test_buffer.TestBuffer.testFromBufferArrayRO) ... ok testFromBufferArrayRW (test_buffer.TestBuffer.testFromBufferArrayRW) ... ok testFromBufferBad (test_buffer.TestBuffer.testFromBufferBad) ... ok testFromBufferBytes (test_buffer.TestBuffer.testFromBufferBytes) ... ok testNewArray (test_buffer.TestBuffer.testNewArray) ... ok testNewBad (test_buffer.TestBuffer.testNewBad) ... ok testNewBytearray (test_buffer.TestBuffer.testNewBytearray) ... ok testNewBytes (test_buffer.TestBuffer.testNewBytes) ... ok testNewEmpty (test_buffer.TestBuffer.testNewEmpty) ... ok testSequence (test_buffer.TestBuffer.testSequence) ... ok testToReadonly (test_buffer.TestBuffer.testToReadonly) ... ok testAllgather (test_cco_buf.TestCCOBufInplaceSelf.testAllgather) ... ok testAllreduce (test_cco_buf.TestCCOBufInplaceSelf.testAllreduce) ... ok testExscan (test_cco_buf.TestCCOBufInplaceSelf.testExscan) ... ok testGather (test_cco_buf.TestCCOBufInplaceSelf.testGather) ... ok testReduce (test_cco_buf.TestCCOBufInplaceSelf.testReduce) ... ok testReduceScatter (test_cco_buf.TestCCOBufInplaceSelf.testReduceScatter) ... ok testReduceScatterBlock (test_cco_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... ok testScan (test_cco_buf.TestCCOBufInplaceSelf.testScan) ... ok testScatter (test_cco_buf.TestCCOBufInplaceSelf.testScatter) ... ok testAllgather (test_cco_buf.TestCCOBufInplaceWorld.testAllgather) ... ok testAllreduce (test_cco_buf.TestCCOBufInplaceWorld.testAllreduce) ... 
ok testExscan (test_cco_buf.TestCCOBufInplaceWorld.testExscan) ... ok testGather (test_cco_buf.TestCCOBufInplaceWorld.testGather) ... ok testReduce (test_cco_buf.TestCCOBufInplaceWorld.testReduce) ... ok testReduceScatter (test_cco_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok testReduceScatterBlock (test_cco_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok testScan (test_cco_buf.TestCCOBufInplaceWorld.testScan) ... ok testScatter (test_cco_buf.TestCCOBufInplaceWorld.testScatter) ... ok testAllgather (test_cco_buf.TestCCOBufSelf.testAllgather) ... ok testAllreduce (test_cco_buf.TestCCOBufSelf.testAllreduce) ... ok testAlltoall (test_cco_buf.TestCCOBufSelf.testAlltoall) ... ok testBarrier (test_cco_buf.TestCCOBufSelf.testBarrier) ... ok testBcast (test_cco_buf.TestCCOBufSelf.testBcast) ... ok testBcastTypeIndexed (test_cco_buf.TestCCOBufSelf.testBcastTypeIndexed) ... ok testExscan (test_cco_buf.TestCCOBufSelf.testExscan) ... ok testGather (test_cco_buf.TestCCOBufSelf.testGather) ... ok testReduce (test_cco_buf.TestCCOBufSelf.testReduce) ... ok testReduceScatter (test_cco_buf.TestCCOBufSelf.testReduceScatter) ... ok testReduceScatterBlock (test_cco_buf.TestCCOBufSelf.testReduceScatterBlock) ... ok testScan (test_cco_buf.TestCCOBufSelf.testScan) ... ok testScatter (test_cco_buf.TestCCOBufSelf.testScatter) ... ok testAllgather (test_cco_buf.TestCCOBufSelfDup.testAllgather) ... ok testAllreduce (test_cco_buf.TestCCOBufSelfDup.testAllreduce) ... ok testAlltoall (test_cco_buf.TestCCOBufSelfDup.testAlltoall) ... ok testBarrier (test_cco_buf.TestCCOBufSelfDup.testBarrier) ... ok testBcast (test_cco_buf.TestCCOBufSelfDup.testBcast) ... ok testBcastTypeIndexed (test_cco_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... ok testExscan (test_cco_buf.TestCCOBufSelfDup.testExscan) ... ok testGather (test_cco_buf.TestCCOBufSelfDup.testGather) ... ok testReduce (test_cco_buf.TestCCOBufSelfDup.testReduce) ... 
ok testReduceScatter (test_cco_buf.TestCCOBufSelfDup.testReduceScatter) ... ok testReduceScatterBlock (test_cco_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... ok testScan (test_cco_buf.TestCCOBufSelfDup.testScan) ... ok testScatter (test_cco_buf.TestCCOBufSelfDup.testScatter) ... ok testAllgather (test_cco_buf.TestCCOBufWorld.testAllgather) ... ok testAllreduce (test_cco_buf.TestCCOBufWorld.testAllreduce) ... ok testAlltoall (test_cco_buf.TestCCOBufWorld.testAlltoall) ... ok testBarrier (test_cco_buf.TestCCOBufWorld.testBarrier) ... ok testBcast (test_cco_buf.TestCCOBufWorld.testBcast) ... ok testBcastTypeIndexed (test_cco_buf.TestCCOBufWorld.testBcastTypeIndexed) ... ok testExscan (test_cco_buf.TestCCOBufWorld.testExscan) ... ok testGather (test_cco_buf.TestCCOBufWorld.testGather) ... ok testReduce (test_cco_buf.TestCCOBufWorld.testReduce) ... ok testReduceScatter (test_cco_buf.TestCCOBufWorld.testReduceScatter) ... ok testReduceScatterBlock (test_cco_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok testScan (test_cco_buf.TestCCOBufWorld.testScan) ... ok testScatter (test_cco_buf.TestCCOBufWorld.testScatter) ... ok testAllgather (test_cco_buf.TestCCOBufWorldDup.testAllgather) ... ok testAllreduce (test_cco_buf.TestCCOBufWorldDup.testAllreduce) ... ok testAlltoall (test_cco_buf.TestCCOBufWorldDup.testAlltoall) ... ok testBarrier (test_cco_buf.TestCCOBufWorldDup.testBarrier) ... ok testBcast (test_cco_buf.TestCCOBufWorldDup.testBcast) ... ok testBcastTypeIndexed (test_cco_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok testExscan (test_cco_buf.TestCCOBufWorldDup.testExscan) ... ok testGather (test_cco_buf.TestCCOBufWorldDup.testGather) ... ok testReduce (test_cco_buf.TestCCOBufWorldDup.testReduce) ... ok testReduceScatter (test_cco_buf.TestCCOBufWorldDup.testReduceScatter) ... ok testReduceScatterBlock (test_cco_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok testScan (test_cco_buf.TestCCOBufWorldDup.testScan) ... 
ok
testScatter (test_cco_buf.TestCCOBufWorldDup.testScatter) ... ok
testReduceLocal (test_cco_buf.TestReduceLocal.testReduceLocal) ... ok
testReduceLocalBadCount (test_cco_buf.TestReduceLocal.testReduceLocalBadCount) ... ok
testAllgather (test_cco_buf_inter.TestCCOBufInter.testAllgather) ... skipped 'mpi-world-size<2'
testAllreduce (test_cco_buf_inter.TestCCOBufInter.testAllreduce) ... skipped 'mpi-world-size<2'
testAlltoall (test_cco_buf_inter.TestCCOBufInter.testAlltoall) ... skipped 'mpi-world-size<2'
testBarrier (test_cco_buf_inter.TestCCOBufInter.testBarrier) ... skipped 'mpi-world-size<2'
testBcast (test_cco_buf_inter.TestCCOBufInter.testBcast) ... skipped 'mpi-world-size<2'
testGather (test_cco_buf_inter.TestCCOBufInter.testGather) ... skipped 'mpi-world-size<2'
testReduce (test_cco_buf_inter.TestCCOBufInter.testReduce) ... skipped 'mpi-world-size<2'
testScatter (test_cco_buf_inter.TestCCOBufInter.testScatter) ... skipped 'mpi-world-size<2'
testAllgather (test_cco_buf_inter.TestCCOBufInterDup.testAllgather) ... skipped 'mpi-world-size<2'
testAllreduce (test_cco_buf_inter.TestCCOBufInterDup.testAllreduce) ... skipped 'mpi-world-size<2'
testAlltoall (test_cco_buf_inter.TestCCOBufInterDup.testAlltoall) ... skipped 'mpi-world-size<2'
testBarrier (test_cco_buf_inter.TestCCOBufInterDup.testBarrier) ... skipped 'mpi-world-size<2'
testBcast (test_cco_buf_inter.TestCCOBufInterDup.testBcast) ... skipped 'mpi-world-size<2'
testGather (test_cco_buf_inter.TestCCOBufInterDup.testGather) ... skipped 'mpi-world-size<2'
testReduce (test_cco_buf_inter.TestCCOBufInterDup.testReduce) ... skipped 'mpi-world-size<2'
testScatter (test_cco_buf_inter.TestCCOBufInterDup.testScatter) ... skipped 'mpi-world-size<2'
testAllgather (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllreduce) ... ok
testExscan (test_cco_nb_buf.TestCCOBufInplaceSelf.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufInplaceSelf.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufInplaceSelf.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testScatter) ... ok
testAllgather (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllreduce) ... ok
testExscan (test_cco_nb_buf.TestCCOBufInplaceWorld.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufInplaceWorld.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufInplaceWorld.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testScatter) ... ok
testAllgather (test_cco_nb_buf.TestCCOBufSelf.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufSelf.testAllreduce) ... ok
testAlltoall (test_cco_nb_buf.TestCCOBufSelf.testAlltoall) ... ok
testBarrier (test_cco_nb_buf.TestCCOBufSelf.testBarrier) ... ok
testBcast (test_cco_nb_buf.TestCCOBufSelf.testBcast) ... ok
testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelf.testBcastTypeIndexed) ... ok
testExscan (test_cco_nb_buf.TestCCOBufSelf.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufSelf.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufSelf.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelf.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufSelf.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufSelf.testScatter) ... ok
testAllgather (test_cco_nb_buf.TestCCOBufSelfDup.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufSelfDup.testAllreduce) ... ok
testAlltoall (test_cco_nb_buf.TestCCOBufSelfDup.testAlltoall) ... ok
testBarrier (test_cco_nb_buf.TestCCOBufSelfDup.testBarrier) ... ok
testBcast (test_cco_nb_buf.TestCCOBufSelfDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_nb_buf.TestCCOBufSelfDup.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufSelfDup.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufSelfDup.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufSelfDup.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufSelfDup.testScatter) ... ok
testAllgather (test_cco_nb_buf.TestCCOBufWorld.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufWorld.testAllreduce) ... ok
testAlltoall (test_cco_nb_buf.TestCCOBufWorld.testAlltoall) ... ok
testBarrier (test_cco_nb_buf.TestCCOBufWorld.testBarrier) ... ok
testBcast (test_cco_nb_buf.TestCCOBufWorld.testBcast) ... ok
testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufWorld.testBcastTypeIndexed) ... ok
testExscan (test_cco_nb_buf.TestCCOBufWorld.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufWorld.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufWorld.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufWorld.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufWorld.testScatter) ... ok
testAllgather (test_cco_nb_buf.TestCCOBufWorldDup.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufWorldDup.testAllreduce) ... ok
testAlltoall (test_cco_nb_buf.TestCCOBufWorldDup.testAlltoall) ... ok
testBarrier (test_cco_nb_buf.TestCCOBufWorldDup.testBarrier) ... ok
testBcast (test_cco_nb_buf.TestCCOBufWorldDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_nb_buf.TestCCOBufWorldDup.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufWorldDup.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufWorldDup.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufWorldDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufWorldDup.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufWorldDup.testScatter) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok
testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok
testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallv) ... ok
testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok
testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok
testAllgatherv (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv) ... ok
testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv3) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv) ... ok
testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv3) ... ok
testAlltoallw (test_cco_nb_vec.TestCCOVecSelf.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelf.testAlltoallwBottom) ... ok
testGatherv (test_cco_nb_vec.TestCCOVecSelf.testGatherv) ... ok
testGatherv2 (test_cco_nb_vec.TestCCOVecSelf.testGatherv2) ... ok
testGatherv3 (test_cco_nb_vec.TestCCOVecSelf.testGatherv3) ... ok
testScatterv (test_cco_nb_vec.TestCCOVecSelf.testScatterv) ... ok
testScatterv2 (test_cco_nb_vec.TestCCOVecSelf.testScatterv2) ... ok
testScatterv3 (test_cco_nb_vec.TestCCOVecSelf.testScatterv3) ... ok
testAllgatherv (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv) ... ok
testGatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv2) ... ok
testGatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv3) ... ok
testScatterv (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv) ... ok
testScatterv2 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv2) ... ok
testScatterv3 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv3) ... ok
testAllgatherv (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv) ... ok
testAllgatherv2 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv3) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv) ... ok
testAlltoallv2 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv3) ... ok
testAlltoallw (test_cco_nb_vec.TestCCOVecWorld.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_nb_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok
testGatherv (test_cco_nb_vec.TestCCOVecWorld.testGatherv) ... ok
testGatherv2 (test_cco_nb_vec.TestCCOVecWorld.testGatherv2) ... ok
testGatherv3 (test_cco_nb_vec.TestCCOVecWorld.testGatherv3) ... ok
testScatterv (test_cco_nb_vec.TestCCOVecWorld.testScatterv) ... ok
testScatterv2 (test_cco_nb_vec.TestCCOVecWorld.testScatterv2) ... ok
testScatterv3 (test_cco_nb_vec.TestCCOVecWorld.testScatterv3) ... ok
testAllgatherv (test_cco_nb_vec.TestCCOVecWorldDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_nb_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_nb_vec.TestCCOVecWorldDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_nb_vec.TestCCOVecWorldDup.testGatherv) ... ok
testGatherv2 (test_cco_nb_vec.TestCCOVecWorldDup.testGatherv2) ... ok
testGatherv3 (test_cco_nb_vec.TestCCOVecWorldDup.testGatherv3) ... ok
testScatterv (test_cco_nb_vec.TestCCOVecWorldDup.testScatterv) ... ok
testScatterv2 (test_cco_nb_vec.TestCCOVecWorldDup.testScatterv2) ... ok
testScatterv3 (test_cco_nb_vec.TestCCOVecWorldDup.testScatterv3) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjSelf.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjSelf.testNeighborAlltoall) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjSelfDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjSelfDup.testNeighborAlltoall) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjWorld.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjWorld.testNeighborAlltoall) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjWorldDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjWorldDup.testNeighborAlltoall) ... ok
testAllgather (test_cco_obj.TestCCOObjSelf.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjSelf.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjSelf.testAlltoall) ... ok
testBarrier (test_cco_obj.TestCCOObjSelf.testBarrier) ... ok
testBcast (test_cco_obj.TestCCOObjSelf.testBcast) ... ok
testExscan (test_cco_obj.TestCCOObjSelf.testExscan) ... ok
testGather (test_cco_obj.TestCCOObjSelf.testGather) ... ok
testReduce (test_cco_obj.TestCCOObjSelf.testReduce) ... ok
testScan (test_cco_obj.TestCCOObjSelf.testScan) ... ok
testScatter (test_cco_obj.TestCCOObjSelf.testScatter) ... ok
testAllgather (test_cco_obj.TestCCOObjSelfDup.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjSelfDup.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjSelfDup.testAlltoall) ... ok
testBarrier (test_cco_obj.TestCCOObjSelfDup.testBarrier) ... ok
testBcast (test_cco_obj.TestCCOObjSelfDup.testBcast) ... ok
testExscan (test_cco_obj.TestCCOObjSelfDup.testExscan) ... ok
testGather (test_cco_obj.TestCCOObjSelfDup.testGather) ... ok
testReduce (test_cco_obj.TestCCOObjSelfDup.testReduce) ... ok
testScan (test_cco_obj.TestCCOObjSelfDup.testScan) ... ok
testScatter (test_cco_obj.TestCCOObjSelfDup.testScatter) ... ok
testAllgather (test_cco_obj.TestCCOObjWorld.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjWorld.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjWorld.testAlltoall) ... ok
testBarrier (test_cco_obj.TestCCOObjWorld.testBarrier) ... ok
testBcast (test_cco_obj.TestCCOObjWorld.testBcast) ... ok
testExscan (test_cco_obj.TestCCOObjWorld.testExscan) ... ok
testGather (test_cco_obj.TestCCOObjWorld.testGather) ... ok
testReduce (test_cco_obj.TestCCOObjWorld.testReduce) ... ok
testScan (test_cco_obj.TestCCOObjWorld.testScan) ... ok
testScatter (test_cco_obj.TestCCOObjWorld.testScatter) ... ok
testAllgather (test_cco_obj.TestCCOObjWorldDup.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjWorldDup.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjWorldDup.testAlltoall) ... ok
testBarrier (test_cco_obj.TestCCOObjWorldDup.testBarrier) ... ok
testBcast (test_cco_obj.TestCCOObjWorldDup.testBcast) ... ok
testExscan (test_cco_obj.TestCCOObjWorldDup.testExscan) ... ok
testGather (test_cco_obj.TestCCOObjWorldDup.testGather) ... ok
testReduce (test_cco_obj.TestCCOObjWorldDup.testReduce) ... ok
testScan (test_cco_obj.TestCCOObjWorldDup.testScan) ... ok
testScatter (test_cco_obj.TestCCOObjWorldDup.testScatter) ... ok
testAllgather (test_cco_obj_inter.TestCCOObjInter.testAllgather) ... skipped 'mpi-world-size<2'
testAllreduce (test_cco_obj_inter.TestCCOObjInter.testAllreduce) ... skipped 'mpi-world-size<2'
testAlltoall (test_cco_obj_inter.TestCCOObjInter.testAlltoall) ... skipped 'mpi-world-size<2'
testBarrier (test_cco_obj_inter.TestCCOObjInter.testBarrier) ... skipped 'mpi-world-size<2'
testBcast (test_cco_obj_inter.TestCCOObjInter.testBcast) ... skipped 'mpi-world-size<2'
testGather (test_cco_obj_inter.TestCCOObjInter.testGather) ... skipped 'mpi-world-size<2'
testReduce (test_cco_obj_inter.TestCCOObjInter.testReduce) ... skipped 'mpi-world-size<2'
testScatter (test_cco_obj_inter.TestCCOObjInter.testScatter) ... skipped 'mpi-world-size<2'
testAllgather (test_cco_obj_inter.TestCCOObjInterDup.testAllgather) ... skipped 'mpi-world-size<2'
testAllreduce (test_cco_obj_inter.TestCCOObjInterDup.testAllreduce) ... skipped 'mpi-world-size<2'
testAlltoall (test_cco_obj_inter.TestCCOObjInterDup.testAlltoall) ... skipped 'mpi-world-size<2'
testBarrier (test_cco_obj_inter.TestCCOObjInterDup.testBarrier) ... skipped 'mpi-world-size<2'
testBcast (test_cco_obj_inter.TestCCOObjInterDup.testBcast) ... skipped 'mpi-world-size<2'
testGather (test_cco_obj_inter.TestCCOObjInterDup.testGather) ... skipped 'mpi-world-size<2'
testReduce (test_cco_obj_inter.TestCCOObjInterDup.testReduce) ... skipped 'mpi-world-size<2'
testScatter (test_cco_obj_inter.TestCCOObjInterDup.testScatter) ... skipped 'mpi-world-size<2'
testAllgather (test_cco_obj_inter.TestCCOObjInterDupDup.testAllgather) ... skipped 'mpi-world-size<2'
testAllreduce (test_cco_obj_inter.TestCCOObjInterDupDup.testAllreduce) ... skipped 'mpi-world-size<2'
testAlltoall (test_cco_obj_inter.TestCCOObjInterDupDup.testAlltoall) ... skipped 'mpi-world-size<2'
testBarrier (test_cco_obj_inter.TestCCOObjInterDupDup.testBarrier) ... skipped 'mpi-world-size<2'
testBcast (test_cco_obj_inter.TestCCOObjInterDupDup.testBcast) ... skipped 'mpi-world-size<2'
testGather (test_cco_obj_inter.TestCCOObjInterDupDup.testGather) ... skipped 'mpi-world-size<2'
testReduce (test_cco_obj_inter.TestCCOObjInterDupDup.testReduce) ... skipped 'mpi-world-size<2'
testScatter (test_cco_obj_inter.TestCCOObjInterDupDup.testScatter) ... skipped 'mpi-world-size<2'
testAllgather (test_cco_pr_buf.TestCCOBufInplaceSelf.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufInplaceSelf.testAllreduce) ... ok
testExscan (test_cco_pr_buf.TestCCOBufInplaceSelf.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufInplaceSelf.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufInplaceSelf.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufInplaceSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufInplaceSelf.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufInplaceSelf.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufInplaceWorld.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufInplaceWorld.testAllreduce) ... ok
testExscan (test_cco_pr_buf.TestCCOBufInplaceWorld.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufInplaceWorld.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufInplaceWorld.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufInplaceWorld.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufInplaceWorld.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufSelf.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufSelf.testAllreduce) ... ok
testAlltoall (test_cco_pr_buf.TestCCOBufSelf.testAlltoall) ... ok
testBarrier (test_cco_pr_buf.TestCCOBufSelf.testBarrier) ... ok
testBcast (test_cco_pr_buf.TestCCOBufSelf.testBcast) ... ok
testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufSelf.testBcastTypeIndexed) ... ok
testExscan (test_cco_pr_buf.TestCCOBufSelf.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufSelf.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufSelf.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufSelf.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufSelf.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufSelf.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufSelfDup.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufSelfDup.testAllreduce) ... ok
testAlltoall (test_cco_pr_buf.TestCCOBufSelfDup.testAlltoall) ... ok
testBarrier (test_cco_pr_buf.TestCCOBufSelfDup.testBarrier) ... ok
testBcast (test_cco_pr_buf.TestCCOBufSelfDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_pr_buf.TestCCOBufSelfDup.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufSelfDup.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufSelfDup.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufSelfDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufSelfDup.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufSelfDup.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufWorld.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufWorld.testAllreduce) ... ok
testAlltoall (test_cco_pr_buf.TestCCOBufWorld.testAlltoall) ... ok
testBarrier (test_cco_pr_buf.TestCCOBufWorld.testBarrier) ... ok
testBcast (test_cco_pr_buf.TestCCOBufWorld.testBcast) ... ok
testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorld.testBcastTypeIndexed) ... ok
testExscan (test_cco_pr_buf.TestCCOBufWorld.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufWorld.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufWorld.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufWorld.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufWorld.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufWorldDup.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufWorldDup.testAllreduce) ... ok
testAlltoall (test_cco_pr_buf.TestCCOBufWorldDup.testAlltoall) ... ok
testBarrier (test_cco_pr_buf.TestCCOBufWorldDup.testBarrier) ... ok
testBcast (test_cco_pr_buf.TestCCOBufWorldDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_pr_buf.TestCCOBufWorldDup.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufWorldDup.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufWorldDup.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufWorldDup.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufWorldDup.testScatter) ... ok
testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallw) ... ok
testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallw) ... ok
testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallw) ... ok
testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallw) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok
testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallv) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok
testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok
testAllgatherv (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv) ... ok
testAllgatherv2 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv3) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv) ... ok
testAlltoallv2 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv3) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecSelf.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_pr_vec.TestCCOVecSelf.testAlltoallwBottom) ... ok
testGatherv (test_cco_pr_vec.TestCCOVecSelf.testGatherv) ... ok
testGatherv2 (test_cco_pr_vec.TestCCOVecSelf.testGatherv2) ... ok
testGatherv3 (test_cco_pr_vec.TestCCOVecSelf.testGatherv3) ... ok
testScatterv (test_cco_pr_vec.TestCCOVecSelf.testScatterv) ... ok
testScatterv2 (test_cco_pr_vec.TestCCOVecSelf.testScatterv2) ... ok
testScatterv3 (test_cco_pr_vec.TestCCOVecSelf.testScatterv3) ... ok
testAllgatherv (test_cco_pr_vec.TestCCOVecSelfDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_pr_vec.TestCCOVecSelfDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_pr_vec.TestCCOVecSelfDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_pr_vec.TestCCOVecSelfDup.testGatherv) ... ok
testGatherv2 (test_cco_pr_vec.TestCCOVecSelfDup.testGatherv2) ... ok
testGatherv3 (test_cco_pr_vec.TestCCOVecSelfDup.testGatherv3) ... ok
testScatterv (test_cco_pr_vec.TestCCOVecSelfDup.testScatterv) ... ok
testScatterv2 (test_cco_pr_vec.TestCCOVecSelfDup.testScatterv2) ... ok
testScatterv3 (test_cco_pr_vec.TestCCOVecSelfDup.testScatterv3) ... ok
testAllgatherv (test_cco_pr_vec.TestCCOVecWorld.testAllgatherv) ... ok
testAllgatherv2 (test_cco_pr_vec.TestCCOVecWorld.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_pr_vec.TestCCOVecWorld.testAllgatherv3) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecWorld.testAlltoallv) ... ok
testAlltoallv2 (test_cco_pr_vec.TestCCOVecWorld.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_pr_vec.TestCCOVecWorld.testAlltoallv3) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecWorld.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_pr_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok
testGatherv (test_cco_pr_vec.TestCCOVecWorld.testGatherv) ... ok
testGatherv2 (test_cco_pr_vec.TestCCOVecWorld.testGatherv2) ... ok
testGatherv3 (test_cco_pr_vec.TestCCOVecWorld.testGatherv3) ... ok
testScatterv (test_cco_pr_vec.TestCCOVecWorld.testScatterv) ... ok
testScatterv2 (test_cco_pr_vec.TestCCOVecWorld.testScatterv2) ... ok
testScatterv3 (test_cco_pr_vec.TestCCOVecWorld.testScatterv3) ... ok
testAllgatherv (test_cco_pr_vec.TestCCOVecWorldDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_pr_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_pr_vec.TestCCOVecWorldDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_pr_vec.TestCCOVecWorldDup.testGatherv) ... ok
testGatherv2 (test_cco_pr_vec.TestCCOVecWorldDup.testGatherv2) ... ok
testGatherv3 (test_cco_pr_vec.TestCCOVecWorldDup.testGatherv3) ... ok
testScatterv (test_cco_pr_vec.TestCCOVecWorldDup.testScatterv) ... ok
testScatterv2 (test_cco_pr_vec.TestCCOVecWorldDup.testScatterv2) ... ok
testScatterv3 (test_cco_pr_vec.TestCCOVecWorldDup.testScatterv3) ... ok
testAlltoallv (test_cco_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok
testAlltoallw (test_cco_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok
testAlltoallw2 (test_cco_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok
testAlltoallv (test_cco_vec.TestCCOVecInplaceWorld.testAlltoallv) ... ok
testAlltoallw (test_cco_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok
testAlltoallw2 (test_cco_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok
testAllgatherv (test_cco_vec.TestCCOVecSelf.testAllgatherv) ... ok
testAllgatherv2 (test_cco_vec.TestCCOVecSelf.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_vec.TestCCOVecSelf.testAllgatherv3) ... ok
testAlltoallv (test_cco_vec.TestCCOVecSelf.testAlltoallv) ... ok
testAlltoallv2 (test_cco_vec.TestCCOVecSelf.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_vec.TestCCOVecSelf.testAlltoallv3) ... ok
testAlltoallw (test_cco_vec.TestCCOVecSelf.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_vec.TestCCOVecSelf.testAlltoallwBottom) ... ok
testGatherv (test_cco_vec.TestCCOVecSelf.testGatherv) ... ok
testGatherv2 (test_cco_vec.TestCCOVecSelf.testGatherv2) ... ok
testGatherv3 (test_cco_vec.TestCCOVecSelf.testGatherv3) ... ok
testScatterv (test_cco_vec.TestCCOVecSelf.testScatterv) ... ok
testScatterv2 (test_cco_vec.TestCCOVecSelf.testScatterv2) ... ok
testScatterv3 (test_cco_vec.TestCCOVecSelf.testScatterv3) ... ok
testAllgatherv (test_cco_vec.TestCCOVecSelfDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_vec.TestCCOVecSelfDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_vec.TestCCOVecSelfDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_vec.TestCCOVecSelfDup.testGatherv) ... ok
testGatherv2 (test_cco_vec.TestCCOVecSelfDup.testGatherv2) ... ok
testGatherv3 (test_cco_vec.TestCCOVecSelfDup.testGatherv3) ... ok
testScatterv (test_cco_vec.TestCCOVecSelfDup.testScatterv) ... ok
testScatterv2 (test_cco_vec.TestCCOVecSelfDup.testScatterv2) ... ok
testScatterv3 (test_cco_vec.TestCCOVecSelfDup.testScatterv3) ... ok
testAllgatherv (test_cco_vec.TestCCOVecWorld.testAllgatherv) ... ok
testAllgatherv2 (test_cco_vec.TestCCOVecWorld.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_vec.TestCCOVecWorld.testAllgatherv3) ... ok
testAlltoallv (test_cco_vec.TestCCOVecWorld.testAlltoallv) ... ok
testAlltoallv2 (test_cco_vec.TestCCOVecWorld.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_vec.TestCCOVecWorld.testAlltoallv3) ... ok
testAlltoallw (test_cco_vec.TestCCOVecWorld.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok
testGatherv (test_cco_vec.TestCCOVecWorld.testGatherv) ... ok
testGatherv2 (test_cco_vec.TestCCOVecWorld.testGatherv2) ... ok
testGatherv3 (test_cco_vec.TestCCOVecWorld.testGatherv3) ... ok
testScatterv (test_cco_vec.TestCCOVecWorld.testScatterv) ... ok
testScatterv2 (test_cco_vec.TestCCOVecWorld.testScatterv2) ... ok
testScatterv3 (test_cco_vec.TestCCOVecWorld.testScatterv3) ... ok
testAllgatherv (test_cco_vec.TestCCOVecWorldDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_vec.TestCCOVecWorldDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_vec.TestCCOVecWorldDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_vec.TestCCOVecWorldDup.testGatherv) ... ok
testGatherv2 (test_cco_vec.TestCCOVecWorldDup.testGatherv2) ... ok
testGatherv3 (test_cco_vec.TestCCOVecWorldDup.testGatherv3) ... ok
testScatterv (test_cco_vec.TestCCOVecWorldDup.testScatterv) ... ok
testScatterv2 (test_cco_vec.TestCCOVecWorldDup.testScatterv2) ... ok
testScatterv3 (test_cco_vec.TestCCOVecWorldDup.testScatterv3) ... ok
testAllgatherv (test_cco_vec_inter.TestCCOVecInter.testAllgatherv) ... skipped 'mpi-world-size<2'
testAlltoallv (test_cco_vec_inter.TestCCOVecInter.testAlltoallv) ... skipped 'mpi-world-size<2'
testAlltoallw (test_cco_vec_inter.TestCCOVecInter.testAlltoallw) ... skipped 'mpi-world-size<2'
testGatherv (test_cco_vec_inter.TestCCOVecInter.testGatherv) ... skipped 'mpi-world-size<2'
testScatterv (test_cco_vec_inter.TestCCOVecInter.testScatterv) ... skipped 'mpi-world-size<2'
testAllgatherv (test_cco_vec_inter.TestCCOVecInterDup.testAllgatherv) ... skipped 'mpi-world-size<2'
testAlltoallv (test_cco_vec_inter.TestCCOVecInterDup.testAlltoallv) ... skipped 'mpi-world-size<2'
testAlltoallw (test_cco_vec_inter.TestCCOVecInterDup.testAlltoallw) ... skipped 'mpi-world-size<2'
testGatherv (test_cco_vec_inter.TestCCOVecInterDup.testGatherv) ... skipped 'mpi-world-size<2'
testScatterv (test_cco_vec_inter.TestCCOVecInterDup.testScatterv) ... skipped 'mpi-world-size<2'
testHandleAddress (test_cffi.TestCFFI.testHandleAddress) ... skipped 'cffi'
testHandleValue (test_cffi.TestCFFI.testHandleValue) ... skipped 'cffi'
testConstructor (test_comm.TestCommNull.testConstructor) ... ok
testConstructorInter (test_comm.TestCommNull.testConstructorInter) ... ok
testConstructorIntra (test_comm.TestCommNull.testConstructorIntra) ... ok
testGetName (test_comm.TestCommNull.testGetName) ... ok
testPickle (test_comm.TestCommNull.testPickle) ... ok
testBuffering (test_comm.TestCommSelf.testBuffering) ... ok
testCloneFree (test_comm.TestCommSelf.testCloneFree) ... ok
testCompare (test_comm.TestCommSelf.testCompare) ... ok
testConstructor (test_comm.TestCommSelf.testConstructor) ... ok
testCreate (test_comm.TestCommSelf.testCreate) ... ok
testCreateFromGroup (test_comm.TestCommSelf.testCreateFromGroup) ... ok
testCreateGroup (test_comm.TestCommSelf.testCreateGroup) ... ok
testDupWithInfo (test_comm.TestCommSelf.testDupWithInfo) ... ok
testGetParent (test_comm.TestCommSelf.testGetParent) ... ok
testGetSetInfo (test_comm.TestCommSelf.testGetSetInfo) ... ok
testGetSetName (test_comm.TestCommSelf.testGetSetName) ... ok
testGroup (test_comm.TestCommSelf.testGroup) ... ok
testIDup (test_comm.TestCommSelf.testIDup) ... ok
testIDupWithInfo (test_comm.TestCommSelf.testIDupWithInfo) ... ok
testIsInter (test_comm.TestCommSelf.testIsInter) ... ok
testPickle (test_comm.TestCommSelf.testPickle) ... ok
testPyProps (test_comm.TestCommSelf.testPyProps) ... ok
testRank (test_comm.TestCommSelf.testRank) ... ok
testSize (test_comm.TestCommSelf.testSize) ... ok
testSplit (test_comm.TestCommSelf.testSplit) ... ok
testSplitTypeHWGuided (test_comm.TestCommSelf.testSplitTypeHWGuided) ... ok
testSplitTypeHWUnguided (test_comm.TestCommSelf.testSplitTypeHWUnguided) ... ok
testSplitTypeResourceGuided (test_comm.TestCommSelf.testSplitTypeResourceGuided) ... ok
testSplitTypeShared (test_comm.TestCommSelf.testSplitTypeShared) ... ok
testBuffering (test_comm.TestCommSelfDup.testBuffering) ... ok
testCloneFree (test_comm.TestCommSelfDup.testCloneFree) ... ok
testCompare (test_comm.TestCommSelfDup.testCompare) ... ok
testConstructor (test_comm.TestCommSelfDup.testConstructor) ... ok
testCreate (test_comm.TestCommSelfDup.testCreate) ... ok
testCreateFromGroup (test_comm.TestCommSelfDup.testCreateFromGroup) ... ok
testCreateGroup (test_comm.TestCommSelfDup.testCreateGroup) ... ok
testDupWithInfo (test_comm.TestCommSelfDup.testDupWithInfo) ... ok
testGetParent (test_comm.TestCommSelfDup.testGetParent) ... ok
testGetSetInfo (test_comm.TestCommSelfDup.testGetSetInfo) ... ok
testGetSetName (test_comm.TestCommSelfDup.testGetSetName) ... ok
testGroup (test_comm.TestCommSelfDup.testGroup) ... ok
testIDup (test_comm.TestCommSelfDup.testIDup) ... ok
testIDupWithInfo (test_comm.TestCommSelfDup.testIDupWithInfo) ... ok
testIsInter (test_comm.TestCommSelfDup.testIsInter) ... ok
testPickle (test_comm.TestCommSelfDup.testPickle) ... ok
testPyProps (test_comm.TestCommSelfDup.testPyProps) ... ok
testRank (test_comm.TestCommSelfDup.testRank) ... ok
testSize (test_comm.TestCommSelfDup.testSize) ... ok
testSplit (test_comm.TestCommSelfDup.testSplit) ... ok
testSplitTypeHWGuided (test_comm.TestCommSelfDup.testSplitTypeHWGuided) ... ok
testSplitTypeHWUnguided (test_comm.TestCommSelfDup.testSplitTypeHWUnguided) ... ok
testSplitTypeResourceGuided (test_comm.TestCommSelfDup.testSplitTypeResourceGuided) ... ok
testSplitTypeShared (test_comm.TestCommSelfDup.testSplitTypeShared) ... ok
testBuffering (test_comm.TestCommWorld.testBuffering) ... ok
testCloneFree (test_comm.TestCommWorld.testCloneFree) ... ok
testCompare (test_comm.TestCommWorld.testCompare) ... ok
testConstructor (test_comm.TestCommWorld.testConstructor) ... ok
testCreate (test_comm.TestCommWorld.testCreate) ... ok
testCreateFromGroup (test_comm.TestCommWorld.testCreateFromGroup) ... ok
testCreateGroup (test_comm.TestCommWorld.testCreateGroup) ... ok
testDupWithInfo (test_comm.TestCommWorld.testDupWithInfo) ... ok
testGetParent (test_comm.TestCommWorld.testGetParent) ... ok
testGetSetInfo (test_comm.TestCommWorld.testGetSetInfo) ... ok
testGetSetName (test_comm.TestCommWorld.testGetSetName) ... ok
testGroup (test_comm.TestCommWorld.testGroup) ... ok
testIDup (test_comm.TestCommWorld.testIDup) ... ok
testIDupWithInfo (test_comm.TestCommWorld.testIDupWithInfo) ... ok
testIsInter (test_comm.TestCommWorld.testIsInter) ... ok
testPickle (test_comm.TestCommWorld.testPickle) ... ok
testPyProps (test_comm.TestCommWorld.testPyProps) ... ok
testRank (test_comm.TestCommWorld.testRank) ... ok
testSize (test_comm.TestCommWorld.testSize) ... ok
testSplit (test_comm.TestCommWorld.testSplit) ... ok
testSplitTypeHWGuided (test_comm.TestCommWorld.testSplitTypeHWGuided) ... ok
testSplitTypeHWUnguided (test_comm.TestCommWorld.testSplitTypeHWUnguided) ... ok
testSplitTypeResourceGuided (test_comm.TestCommWorld.testSplitTypeResourceGuided) ... ok
testSplitTypeShared (test_comm.TestCommWorld.testSplitTypeShared) ... ok
testBuffering (test_comm.TestCommWorldDup.testBuffering) ... ok
testCloneFree (test_comm.TestCommWorldDup.testCloneFree) ... ok
testCompare (test_comm.TestCommWorldDup.testCompare) ... ok
testConstructor (test_comm.TestCommWorldDup.testConstructor) ... ok
testCreate (test_comm.TestCommWorldDup.testCreate) ... ok
testCreateFromGroup (test_comm.TestCommWorldDup.testCreateFromGroup) ... ok
testCreateGroup (test_comm.TestCommWorldDup.testCreateGroup) ... ok
testDupWithInfo (test_comm.TestCommWorldDup.testDupWithInfo) ... ok
testGetParent (test_comm.TestCommWorldDup.testGetParent) ... ok
testGetSetInfo (test_comm.TestCommWorldDup.testGetSetInfo) ... ok
testGetSetName (test_comm.TestCommWorldDup.testGetSetName) ... ok
testGroup (test_comm.TestCommWorldDup.testGroup) ... ok
testIDup (test_comm.TestCommWorldDup.testIDup) ... ok
testIDupWithInfo (test_comm.TestCommWorldDup.testIDupWithInfo) ... ok
testIsInter (test_comm.TestCommWorldDup.testIsInter) ... ok
testPickle (test_comm.TestCommWorldDup.testPickle) ... ok
testPyProps (test_comm.TestCommWorldDup.testPyProps) ... ok
testRank (test_comm.TestCommWorldDup.testRank) ... ok
testSize (test_comm.TestCommWorldDup.testSize) ... ok
testSplit (test_comm.TestCommWorldDup.testSplit) ... ok
testSplitTypeHWGuided (test_comm.TestCommWorldDup.testSplitTypeHWGuided) ... ok
testSplitTypeHWUnguided (test_comm.TestCommWorldDup.testSplitTypeHWUnguided) ... ok
testSplitTypeResourceGuided (test_comm.TestCommWorldDup.testSplitTypeResourceGuided) ... ok
testSplitTypeShared (test_comm.TestCommWorldDup.testSplitTypeShared) ... ok
testConstructor (test_comm_inter.TestIntercomm.testConstructor) ... skipped 'mpi-world-size<2'
testCreateFromGroups (test_comm_inter.TestIntercomm.testCreateFromGroups) ... skipped 'mpi-world-size<2'
testFortran (test_comm_inter.TestIntercomm.testFortran) ... skipped 'mpi-world-size<2'
testLocalGroupSizeRank (test_comm_inter.TestIntercomm.testLocalGroupSizeRank) ... skipped 'mpi-world-size<2'
testMerge (test_comm_inter.TestIntercomm.testMerge) ... skipped 'mpi-world-size<2'
testPyProps (test_comm_inter.TestIntercomm.testPyProps) ... skipped 'mpi-world-size<2'
testRemoteGroupSize (test_comm_inter.TestIntercomm.testRemoteGroupSize) ... skipped 'mpi-world-size<2'
testSplit (test_comm_inter.TestIntercomm.testSplit) ...
skipped 'mpi-world-size<2' testSplitTypeShared (test_comm_inter.TestIntercomm.testSplitTypeShared) ... skipped 'mpi-world-size<2' testHalf (test_comm_inter.TestIntercommCreateFromGroups.testHalf) ... skipped 'mpi-world-size<2' testPair (test_comm_inter.TestIntercommCreateFromGroups.testPair) ... skipped 'mpi-world-size<2' testConstructor (test_comm_inter.TestIntercommDup.testConstructor) ... skipped 'mpi-world-size<2' testCreateFromGroups (test_comm_inter.TestIntercommDup.testCreateFromGroups) ... skipped 'mpi-world-size<2' testFortran (test_comm_inter.TestIntercommDup.testFortran) ... skipped 'mpi-world-size<2' testLocalGroupSizeRank (test_comm_inter.TestIntercommDup.testLocalGroupSizeRank) ... skipped 'mpi-world-size<2' testMerge (test_comm_inter.TestIntercommDup.testMerge) ... skipped 'mpi-world-size<2' testPyProps (test_comm_inter.TestIntercommDup.testPyProps) ... skipped 'mpi-world-size<2' testRemoteGroupSize (test_comm_inter.TestIntercommDup.testRemoteGroupSize) ... skipped 'mpi-world-size<2' testSplit (test_comm_inter.TestIntercommDup.testSplit) ... skipped 'mpi-world-size<2' testSplitTypeShared (test_comm_inter.TestIntercommDup.testSplitTypeShared) ... skipped 'mpi-world-size<2' testConstructor (test_comm_inter.TestIntercommDupDup.testConstructor) ... skipped 'mpi-world-size<2' testCreateFromGroups (test_comm_inter.TestIntercommDupDup.testCreateFromGroups) ... skipped 'mpi-world-size<2' testFortran (test_comm_inter.TestIntercommDupDup.testFortran) ... skipped 'mpi-world-size<2' testLocalGroupSizeRank (test_comm_inter.TestIntercommDupDup.testLocalGroupSizeRank) ... skipped 'mpi-world-size<2' testMerge (test_comm_inter.TestIntercommDupDup.testMerge) ... skipped 'mpi-world-size<2' testPyProps (test_comm_inter.TestIntercommDupDup.testPyProps) ... skipped 'mpi-world-size<2' testRemoteGroupSize (test_comm_inter.TestIntercommDupDup.testRemoteGroupSize) ... skipped 'mpi-world-size<2' testSplit (test_comm_inter.TestIntercommDupDup.testSplit) ... 
skipped 'mpi-world-size<2' testSplitTypeShared (test_comm_inter.TestIntercommDupDup.testSplitTypeShared) ... skipped 'mpi-world-size<2' testConstructorCartcomm (test_comm_topo.TestTopoConstructor.testConstructorCartcomm) ... ok testConstructorDistGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorDistGraphcomm) ... ok testConstructorGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorGraphcomm) ... ok testConstructorTopocomm (test_comm_topo.TestTopoConstructor.testConstructorTopocomm) ... ok testCartMap (test_comm_topo.TestTopoSelf.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoSelf.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelf.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelf.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelf.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelf.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelf.testGraphcomm) ... ok testCartMap (test_comm_topo.TestTopoSelfDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoSelfDup.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelfDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelfDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelfDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelfDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelfDup.testGraphcomm) ... ok testCartMap (test_comm_topo.TestTopoWorld.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorld.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoWorld.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorld.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorld.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorld.testGraphMap) ... 
ok testGraphcomm (test_comm_topo.TestTopoWorld.testGraphcomm) ... ok testCartMap (test_comm_topo.TestTopoWorldDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorldDup.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoWorldDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorldDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorldDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorldDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorldDup.testGraphcomm) ... ok testHandleAdress (test_ctypes.TestCTYPES.testHandleAdress) ... ok testHandleValue (test_ctypes.TestCTYPES.testHandleValue) ... ok testBoolEqNe (test_datatype.TestDatatype.testBoolEqNe) ... ok testCodeCharStr (test_datatype.TestDatatype.testCodeCharStr) ... ok testCommit (test_datatype.TestDatatype.testCommit) ... ok testGetEnvelope (test_datatype.TestDatatype.testGetEnvelope) ... ok testGetExtent (test_datatype.TestDatatype.testGetExtent) ... ok testGetSetName (test_datatype.TestDatatype.testGetSetName) ... ok testGetSize (test_datatype.TestDatatype.testGetSize) ... ok testGetTrueExtent (test_datatype.TestDatatype.testGetTrueExtent) ... ok testGetValueIndex (test_datatype.TestDatatype.testGetValueIndex) ... ok testMatchSize (test_datatype.TestDatatype.testMatchSize) ... ok testContiguous (test_datatype.TestDatatypeCreate.testContiguous) ... ok testDarray (test_datatype.TestDatatypeCreate.testDarray) ... ok testDup (test_datatype.TestDatatypeCreate.testDup) ... ok testF90ComplexDouble (test_datatype.TestDatatypeCreate.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypeCreate.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypeCreate.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypeCreate.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypeCreate.testF90RealSingle) ... 
ok testHindexed (test_datatype.TestDatatypeCreate.testHindexed) ... ok testHindexedBlock (test_datatype.TestDatatypeCreate.testHindexedBlock) ... ok testHvector (test_datatype.TestDatatypeCreate.testHvector) ... ok testIndexed (test_datatype.TestDatatypeCreate.testIndexed) ... ok testIndexedBlock (test_datatype.TestDatatypeCreate.testIndexedBlock) ... ok testResized (test_datatype.TestDatatypeCreate.testResized) ... ok testStruct (test_datatype.TestDatatypeCreate.testStruct) ... ok testSubarray (test_datatype.TestDatatypeCreate.testSubarray) ... ok testValueIndex (test_datatype.TestDatatypeCreate.testValueIndex) ... ok testVector (test_datatype.TestDatatypeCreate.testVector) ... ok testConstructor (test_datatype.TestDatatypeNull.testConstructor) ... ok testGetName (test_datatype.TestDatatypeNull.testGetName) ... ok testContiguous (test_datatype.TestDatatypePickle.testContiguous) ... ok testDarray (test_datatype.TestDatatypePickle.testDarray) ... ok testDup (test_datatype.TestDatatypePickle.testDup) ... ok testF90ComplexDouble (test_datatype.TestDatatypePickle.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypePickle.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypePickle.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypePickle.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypePickle.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypePickle.testHindexed) ... ok testHindexedBlock (test_datatype.TestDatatypePickle.testHindexedBlock) ... ok testHvector (test_datatype.TestDatatypePickle.testHvector) ... ok testIndexed (test_datatype.TestDatatypePickle.testIndexed) ... ok testIndexedBlock (test_datatype.TestDatatypePickle.testIndexedBlock) ... ok testNamed (test_datatype.TestDatatypePickle.testNamed) ... ok testResized (test_datatype.TestDatatypePickle.testResized) ... ok testStruct (test_datatype.TestDatatypePickle.testStruct) ... 
ok testSubarray (test_datatype.TestDatatypePickle.testSubarray) ... ok testValueIndex (test_datatype.TestDatatypePickle.testValueIndex) ... ok testVector (test_datatype.TestDatatypePickle.testVector) ... ok testDoc (test_doc.TestDoc.testDoc) ... ok testAcceptConnect (test_dynproc.TestDPM.testAcceptConnect) ... skipped 'mpi-world-size<2' testConnectAccept (test_dynproc.TestDPM.testConnectAccept) ... skipped 'mpi-world-size<2' testJoin (test_dynproc.TestDPM.testJoin) ... skipped 'mpi-world-size<2' testNamePublishing (test_dynproc.TestDPM.testNamePublishing) ... skipped 'mpi-world-size<2' testGetHWResourceInfo (test_environ.TestEnviron.testGetHWResourceInfo) ... ok testGetLibraryVersion (test_environ.TestEnviron.testGetLibraryVersion) ... ok testGetProcessorName (test_environ.TestEnviron.testGetProcessorName) ... ok testGetVersion (test_environ.TestEnviron.testGetVersion) ... ok testIsFinalized (test_environ.TestEnviron.testIsFinalized) ... ok testIsInitialized (test_environ.TestEnviron.testIsInitialized) ... ok testPControl (test_environ.TestEnviron.testPControl) ... ok testWTick (test_environ.TestEnviron.testWTick) ... ok testWTime (test_environ.TestEnviron.testWTime) ... ok testAppNum (test_environ.TestWorldAttrs.testAppNum) ... ok testIOProcessor (test_environ.TestWorldAttrs.testIOProcessor) ... ok testLastUsedCode (test_environ.TestWorldAttrs.testLastUsedCode) ... ok testUniverseSize (test_environ.TestWorldAttrs.testUniverseSize) ... ok testWTimeIsGlobal (test_environ.TestWorldAttrs.testWTimeIsGlobal) ... ok testPickle (test_errhandler.TestErrhandler.testPickle) ... ok testPredefined (test_errhandler.TestErrhandler.testPredefined) ... ok testCall (test_errhandler.TestErrhandlerComm.testCall) ... ok testCreate (test_errhandler.TestErrhandlerComm.testCreate) ... ok testErrorsAbort (test_errhandler.TestErrhandlerComm.testErrorsAbort) ... ok testErrorsFatal (test_errhandler.TestErrhandlerComm.testErrorsFatal) ... 
ok testErrorsReturn (test_errhandler.TestErrhandlerComm.testErrorsReturn) ... ok testGetFree (test_errhandler.TestErrhandlerComm.testGetFree) ... ok testCall (test_errhandler.TestErrhandlerFile.testCall) ... ok testCreate (test_errhandler.TestErrhandlerFile.testCreate) ... ok testErrorsAbort (test_errhandler.TestErrhandlerFile.testErrorsAbort) ... ok testErrorsFatal (test_errhandler.TestErrhandlerFile.testErrorsFatal) ... ok testErrorsReturn (test_errhandler.TestErrhandlerFile.testErrorsReturn) ... ok testGetFree (test_errhandler.TestErrhandlerFile.testGetFree) ... ok testCall (test_errhandler.TestErrhandlerSession.testCall) ... ok testCreate (test_errhandler.TestErrhandlerSession.testCreate) ... ok testErrorsAbort (test_errhandler.TestErrhandlerSession.testErrorsAbort) ... ok testErrorsFatal (test_errhandler.TestErrhandlerSession.testErrorsFatal) ... ok testErrorsReturn (test_errhandler.TestErrhandlerSession.testErrorsReturn) ... ok testGetFree (test_errhandler.TestErrhandlerSession.testGetFree) ... ok testCall (test_errhandler.TestErrhandlerWin.testCall) ... ok testCreate (test_errhandler.TestErrhandlerWin.testCreate) ... ok testErrorsAbort (test_errhandler.TestErrhandlerWin.testErrorsAbort) ... ok testErrorsFatal (test_errhandler.TestErrhandlerWin.testErrorsFatal) ... ok testErrorsReturn (test_errhandler.TestErrhandlerWin.testErrorsReturn) ... ok testGetFree (test_errhandler.TestErrhandlerWin.testGetFree) ... ok testAddErrorClass (test_errorcode.TestErrorCode.testAddErrorClass) ... ok testAddErrorClassCodeString (test_errorcode.TestErrorCode.testAddErrorClassCodeString) ... ok testAddErrorCode (test_errorcode.TestErrorCode.testAddErrorCode) ... ok testException (test_errorcode.TestErrorCode.testException) ... ok testGetErrorClass (test_errorcode.TestErrorCode.testGetErrorClass) ... ok testGetErrorStrings (test_errorcode.TestErrorCode.testGetErrorStrings) ... ok testFreeSelf (test_exceptions.TestExcComm.testFreeSelf) ... 
ok testFreeWorld (test_exceptions.TestExcComm.testFreeWorld) ... ok testKeyvalInvalid (test_exceptions.TestExcComm.testKeyvalInvalid) ... ok testAccessors (test_exceptions.TestExcCommNull.testAccessors) ... ok testCompare (test_exceptions.TestExcCommNull.testCompare) ... ok testDisconnect (test_exceptions.TestExcCommNull.testDisconnect) ... ok testFree (test_exceptions.TestExcCommNull.testFree) ... ok testGetAttr (test_exceptions.TestExcCommNull.testGetAttr) ... ok testGetErrhandler (test_exceptions.TestExcCommNull.testGetErrhandler) ... ok testInterNull (test_exceptions.TestExcCommNull.testInterNull) ... ok testIntraNull (test_exceptions.TestExcCommNull.testIntraNull) ... ok testSetErrhandler (test_exceptions.TestExcCommNull.testSetErrhandler) ... ok testFreePredefined (test_exceptions.TestExcDatatype.testFreePredefined) ... ok testKeyvalInvalid (test_exceptions.TestExcDatatype.testKeyvalInvalid) ... ok testCommit (test_exceptions.TestExcDatatypeNull.testCommit) ... ok testDup (test_exceptions.TestExcDatatypeNull.testDup) ... ok testFree (test_exceptions.TestExcDatatypeNull.testFree) ... ok testCommSelfSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommSelfSetErrhandler) ... ok testCommWorldSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommWorldSetErrhandler) ... ok testFree (test_exceptions.TestExcErrhandlerNull.testFree) ... ok testAccessors (test_exceptions.TestExcGroupNull.testAccessors) ... ok testCompare (test_exceptions.TestExcGroupNull.testCompare) ... ok testDelete (test_exceptions.TestExcInfo.testDelete) ... ok testGetNthKey (test_exceptions.TestExcInfo.testGetNthKey) ... ok testDelete (test_exceptions.TestExcInfoNull.testDelete) ... ok testDup (test_exceptions.TestExcInfoNull.testDup) ... ok testFree (test_exceptions.TestExcInfoNull.testFree) ... ok testGet (test_exceptions.TestExcInfoNull.testGet) ... ok testGetNKeys (test_exceptions.TestExcInfoNull.testGetNKeys) ... 
ok testGetNthKey (test_exceptions.TestExcInfoNull.testGetNthKey) ... ok testSet (test_exceptions.TestExcInfoNull.testSet) ... ok testTruth (test_exceptions.TestExcInfoNull.testTruth) ... ok testFreePredefined (test_exceptions.TestExcOp.testFreePredefined) ... ok testFree (test_exceptions.TestExcOpNull.testFree) ... ok testCancel (test_exceptions.TestExcRequestNull.testCancel) ... ok testFree (test_exceptions.TestExcRequestNull.testFree) ... ok testCreateGroup (test_exceptions.TestExcSession.testCreateGroup) ... ok testGetNthPsetNeg (test_exceptions.TestExcSession.testGetNthPsetNeg) ... ok testGetNthPsetPos (test_exceptions.TestExcSession.testGetNthPsetPos) ... ok testGetPsetInfo (test_exceptions.TestExcSession.testGetPsetInfo) ... ok testCreateGroup (test_exceptions.TestExcSessionNull.testCreateGroup) ... ok testGetErrhandler (test_exceptions.TestExcSessionNull.testGetErrhandler) ... ok testGetInfo (test_exceptions.TestExcSessionNull.testGetInfo) ... ok testGetNthPset (test_exceptions.TestExcSessionNull.testGetNthPset) ... ok testGetNumPsets (test_exceptions.TestExcSessionNull.testGetNumPsets) ... ok testGetPsetInfo (test_exceptions.TestExcSessionNull.testGetPsetInfo) ... ok testSetErrhandler (test_exceptions.TestExcSessionNull.testSetErrhandler) ... ok testGetCount (test_exceptions.TestExcStatus.testGetCount) ... ok testGetElements (test_exceptions.TestExcStatus.testGetElements) ... ok testSetElements (test_exceptions.TestExcStatus.testSetElements) ... ok testKeyvalInvalid (test_exceptions.TestExcWin.testKeyvalInvalid) ... ok testCallErrhandler (test_exceptions.TestExcWinNull.testCallErrhandler) ... ok testFree (test_exceptions.TestExcWinNull.testFree) ... ok testGetErrhandler (test_exceptions.TestExcWinNull.testGetErrhandler) ... ok testSetErrhandler (test_exceptions.TestExcWinNull.testSetErrhandler) ... ok testGetSetErrhandler (test_file.TestFileNull.testGetSetErrhandler) ... ok testBytes (test_file.TestFilePath.testBytes) ... 
ok testGetAmode (test_file.TestFilePath.testGetAmode) ... ok testGetByteOffset (test_file.TestFilePath.testGetByteOffset) ... ok testGetErrhandler (test_file.TestFilePath.testGetErrhandler) ... ok testGetGroup (test_file.TestFilePath.testGetGroup) ... ok testGetSetAtomicity (test_file.TestFilePath.testGetSetAtomicity) ... ok testGetSetInfo (test_file.TestFilePath.testGetSetInfo) ... ok testGetSetSize (test_file.TestFilePath.testGetSetSize) ... ok testGetSetView (test_file.TestFilePath.testGetSetView) ... ok testGetTypeExtent (test_file.TestFilePath.testGetTypeExtent) ... ok testPath (test_file.TestFilePath.testPath) ... ok testPickle (test_file.TestFilePath.testPickle) ... ok testPreallocate (test_file.TestFilePath.testPreallocate) ... ok testPyProps (test_file.TestFilePath.testPyProps) ... ok testSeekGetPosition (test_file.TestFilePath.testSeekGetPosition) ... ok testSeekGetPositionShared (test_file.TestFilePath.testSeekGetPositionShared) ... ok testStr (test_file.TestFilePath.testStr) ... ok testSync (test_file.TestFilePath.testSync) ... ok testGetAmode (test_file.TestFileSelf.testGetAmode) ... ok testGetByteOffset (test_file.TestFileSelf.testGetByteOffset) ... ok testGetErrhandler (test_file.TestFileSelf.testGetErrhandler) ... ok testGetGroup (test_file.TestFileSelf.testGetGroup) ... ok testGetSetAtomicity (test_file.TestFileSelf.testGetSetAtomicity) ... ok testGetSetInfo (test_file.TestFileSelf.testGetSetInfo) ... ok testGetSetSize (test_file.TestFileSelf.testGetSetSize) ... ok testGetSetView (test_file.TestFileSelf.testGetSetView) ... ok testGetTypeExtent (test_file.TestFileSelf.testGetTypeExtent) ... ok testPickle (test_file.TestFileSelf.testPickle) ... ok testPreallocate (test_file.TestFileSelf.testPreallocate) ... ok testPyProps (test_file.TestFileSelf.testPyProps) ... ok testSeekGetPosition (test_file.TestFileSelf.testSeekGetPosition) ... ok testSeekGetPositionShared (test_file.TestFileSelf.testSeekGetPositionShared) ... 
ok testSync (test_file.TestFileSelf.testSync) ... ok testFortran (test_fortran.TestFortranComm.testFortran) ... ok testFortran (test_fortran.TestFortranDatatype.testFortran) ... ok testFortran (test_fortran.TestFortranErrhandler.testFortran) ... ok testFortran (test_fortran.TestFortranFile.testFortran) ... ok testFortran (test_fortran.TestFortranGroup.testFortran) ... ok testFortran (test_fortran.TestFortranInfo.testFortran) ... ok testFortran (test_fortran.TestFortranMessage.testFortran) ... ok testFortran (test_fortran.TestFortranOp.testFortran) ... ok testFortran (test_fortran.TestFortranRequest.testFortran) ... ok testFortran (test_fortran.TestFortranSession.testFortran) ... ok testFintArray (test_fortran.TestFortranStatus.testFintArray) ... ok testFortran (test_fortran.TestFortranStatus.testFortran) ... ok testFortran (test_fortran.TestFortranWin.testFortran) ... ok testAll (test_grequest.TestGrequest.testAll) ... ok testAll1 (test_grequest.TestGrequest.testAll1) ... ok testAll2 (test_grequest.TestGrequest.testAll2) ... ok testConstructor (test_grequest.TestGrequest.testConstructor) ... ok testExceptionHandling (test_grequest.TestGrequest.testExceptionHandling) ... ok testPyCompleteTest (test_grequest.TestGrequest.testPyCompleteTest) ... ok testPyCompleteWait (test_grequest.TestGrequest.testPyCompleteWait) ... ok testCompare (test_group.TestGroupEmpty.testCompare) ... ok testDifference (test_group.TestGroupEmpty.testDifference) ... ok testDup (test_group.TestGroupEmpty.testDup) ... ok testEmpty (test_group.TestGroupEmpty.testEmpty) ... ok testExcl (test_group.TestGroupEmpty.testExcl) ... ok testIncl (test_group.TestGroupEmpty.testIncl) ... ok testIntersection (test_group.TestGroupEmpty.testIntersection) ... ok testPickle (test_group.TestGroupEmpty.testPickle) ... ok testProperties (test_group.TestGroupEmpty.testProperties) ... ok testRangeExcl (test_group.TestGroupEmpty.testRangeExcl) ... ok testRangeIncl (test_group.TestGroupEmpty.testRangeIncl) ... 
ok testRank (test_group.TestGroupEmpty.testRank) ... ok testSize (test_group.TestGroupEmpty.testSize) ... ok testTranslRanks (test_group.TestGroupEmpty.testTranslRanks) ... ok testTranslRanksGroupEmpty (test_group.TestGroupEmpty.testTranslRanksGroupEmpty) ... ok testTranslRanksProcNull (test_group.TestGroupEmpty.testTranslRanksProcNull) ... ok testUnion (test_group.TestGroupEmpty.testUnion) ... ok testConstructor (test_group.TestGroupNull.testConstructor) ... ok testNull (test_group.TestGroupNull.testNull) ... ok testPickle (test_group.TestGroupNull.testPickle) ... ok testCompare (test_group.TestGroupSelf.testCompare) ... ok testDifference (test_group.TestGroupSelf.testDifference) ... ok testDup (test_group.TestGroupSelf.testDup) ... ok testExcl (test_group.TestGroupSelf.testExcl) ... ok testIncl (test_group.TestGroupSelf.testIncl) ... ok testIntersection (test_group.TestGroupSelf.testIntersection) ... ok testPickle (test_group.TestGroupSelf.testPickle) ... ok testProperties (test_group.TestGroupSelf.testProperties) ... ok testRangeExcl (test_group.TestGroupSelf.testRangeExcl) ... ok testRangeIncl (test_group.TestGroupSelf.testRangeIncl) ... ok testRank (test_group.TestGroupSelf.testRank) ... ok testSize (test_group.TestGroupSelf.testSize) ... ok testTranslRanks (test_group.TestGroupSelf.testTranslRanks) ... ok testTranslRanksGroupEmpty (test_group.TestGroupSelf.testTranslRanksGroupEmpty) ... ok testTranslRanksProcNull (test_group.TestGroupSelf.testTranslRanksProcNull) ... ok testUnion (test_group.TestGroupSelf.testUnion) ... ok testCompare (test_group.TestGroupWorld.testCompare) ... ok testDifference (test_group.TestGroupWorld.testDifference) ... ok testDup (test_group.TestGroupWorld.testDup) ... ok testExcl (test_group.TestGroupWorld.testExcl) ... ok testIncl (test_group.TestGroupWorld.testIncl) ... ok testIntersection (test_group.TestGroupWorld.testIntersection) ... ok testPickle (test_group.TestGroupWorld.testPickle) ... 
ok testProperties (test_group.TestGroupWorld.testProperties) ... ok testRangeExcl (test_group.TestGroupWorld.testRangeExcl) ... ok testRangeIncl (test_group.TestGroupWorld.testRangeIncl) ... ok testRank (test_group.TestGroupWorld.testRank) ... ok testSize (test_group.TestGroupWorld.testSize) ... ok testTranslRanks (test_group.TestGroupWorld.testTranslRanks) ... ok testTranslRanksGroupEmpty (test_group.TestGroupWorld.testTranslRanksGroupEmpty) ... ok testTranslRanksProcNull (test_group.TestGroupWorld.testTranslRanksProcNull) ... ok testUnion (test_group.TestGroupWorld.testUnion) ... ok testCreate (test_info.TestInfo.testCreate) ... ok testCreateBad (test_info.TestInfo.testCreateBad) ... ok testDup (test_info.TestInfo.testDup) ... ok testGet (test_info.TestInfo.testGet) ... ok testGetNKeys (test_info.TestInfo.testGetNKeys) ... ok testGetSetDelete (test_info.TestInfo.testGetSetDelete) ... ok testPickle (test_info.TestInfo.testPickle) ... ok testPyMethods (test_info.TestInfo.testPyMethods) ... ok testTruth (test_info.TestInfo.testTruth) ... ok testCreateEnv (test_info.TestInfoEnv.testCreateEnv) ... ok testDup (test_info.TestInfoEnv.testDup) ... ok testPickle (test_info.TestInfoEnv.testPickle) ... ok testPyMethods (test_info.TestInfoEnv.testPyMethods) ... ok testTruth (test_info.TestInfoEnv.testTruth) ... ok testPickle (test_info.TestInfoNull.testPickle) ... ok testPyMethods (test_info.TestInfoNull.testPyMethods) ... ok testTruth (test_info.TestInfoNull.testTruth) ... ok testRegister (test_io.TestDatarep.testRegister) ... ok testIReadIWrite (test_io.TestIOBasicSelf.testIReadIWrite) ... ok testIReadIWriteAll (test_io.TestIOBasicSelf.testIReadIWriteAll) ... ok testIReadIWriteAt (test_io.TestIOBasicSelf.testIReadIWriteAt) ... ok testIReadIWriteAtAll (test_io.TestIOBasicSelf.testIReadIWriteAtAll) ... ok testIReadIWriteShared (test_io.TestIOBasicSelf.testIReadIWriteShared) ... ok testReadWrite (test_io.TestIOBasicSelf.testReadWrite) ... 
ok testReadWriteAll (test_io.TestIOBasicSelf.testReadWriteAll) ... ok testReadWriteAllBeginEnd (test_io.TestIOBasicSelf.testReadWriteAllBeginEnd) ... ok testReadWriteAt (test_io.TestIOBasicSelf.testReadWriteAt) ... ok testReadWriteAtAll (test_io.TestIOBasicSelf.testReadWriteAtAll) ... ok testReadWriteAtAllBeginEnd (test_io.TestIOBasicSelf.testReadWriteAtAllBeginEnd) ... ok testReadWriteOrdered (test_io.TestIOBasicSelf.testReadWriteOrdered) ... ok testReadWriteOrderedBeginEnd (test_io.TestIOBasicSelf.testReadWriteOrderedBeginEnd) ... ok testReadWriteShared (test_io.TestIOBasicSelf.testReadWriteShared) ... ok testIReadIWrite (test_io.TestIOBasicWorld.testIReadIWrite) ... ok testIReadIWriteAll (test_io.TestIOBasicWorld.testIReadIWriteAll) ... ok testIReadIWriteAt (test_io.TestIOBasicWorld.testIReadIWriteAt) ... ok testIReadIWriteAtAll (test_io.TestIOBasicWorld.testIReadIWriteAtAll) ... ok testIReadIWriteShared (test_io.TestIOBasicWorld.testIReadIWriteShared) ... ok testReadWrite (test_io.TestIOBasicWorld.testReadWrite) ... ok testReadWriteAll (test_io.TestIOBasicWorld.testReadWriteAll) ... ok testReadWriteAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAllBeginEnd) ... ok testReadWriteAt (test_io.TestIOBasicWorld.testReadWriteAt) ... ok testReadWriteAtAll (test_io.TestIOBasicWorld.testReadWriteAtAll) ... ok testReadWriteAtAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAtAllBeginEnd) ... ok testReadWriteOrdered (test_io.TestIOBasicWorld.testReadWriteOrdered) ... ok testReadWriteOrderedBeginEnd (test_io.TestIOBasicWorld.testReadWriteOrderedBeginEnd) ... ok testReadWriteShared (test_io.TestIOBasicWorld.testReadWriteShared) ... ok testContiguous (test_io.TestIOViewSelf.testContiguous) ... ok testDarrayBlock (test_io.TestIOViewSelf.testDarrayBlock) ... ok testDarrayCyclic (test_io.TestIOViewSelf.testDarrayCyclic) ... ok testDup (test_io.TestIOViewSelf.testDup) ... ok testHIndexed (test_io.TestIOViewSelf.testHIndexed) ... 
ok
testHIndexedBlock (test_io.TestIOViewSelf.testHIndexedBlock) ... ok
testHVector (test_io.TestIOViewSelf.testHVector) ... ok
testIndexed (test_io.TestIOViewSelf.testIndexed) ... ok
testIndexedBlock (test_io.TestIOViewSelf.testIndexedBlock) ... ok
testNamed (test_io.TestIOViewSelf.testNamed) ... ok
testStruct (test_io.TestIOViewSelf.testStruct) ... ok
testSubarray (test_io.TestIOViewSelf.testSubarray) ... ok
testVector (test_io.TestIOViewSelf.testVector) ... ok
testContiguous (test_io.TestIOViewWorld.testContiguous) ... ok
testDarrayBlock (test_io.TestIOViewWorld.testDarrayBlock) ... ok
testDarrayCyclic (test_io.TestIOViewWorld.testDarrayCyclic) ... ok
testDup (test_io.TestIOViewWorld.testDup) ... ok
testHIndexed (test_io.TestIOViewWorld.testHIndexed) ... ok
testHIndexedBlock (test_io.TestIOViewWorld.testHIndexedBlock) ... ok
testHVector (test_io.TestIOViewWorld.testHVector) ... ok
testIndexed (test_io.TestIOViewWorld.testIndexed) ... ok
testIndexedBlock (test_io.TestIOViewWorld.testIndexedBlock) ... ok
testNamed (test_io.TestIOViewWorld.testNamed) ... ok
testStruct (test_io.TestIOViewWorld.testStruct) ... ok
testSubarray (test_io.TestIOViewWorld.testSubarray) ... ok
testVector (test_io.TestIOViewWorld.testVector) ... ok
testLargeCountSymbols (test_mpiapi.TestMPIAPI.testLargeCountSymbols) ... ok
testSymbolCoverage (test_mpiapi.TestMPIAPI.testSymbolCoverage) ... ok
testMemory1 (test_mpimem.TestMemory.testMemory1) ... ok
testMemory2 (test_mpimem.TestMemory.testMemory2) ... ok
testMessageBad (test_msgspec.TestMessageBlock.testMessageBad) ... skipped 'mpi-world-size<2'
testAttrEmpty (test_msgspec.TestMessageCAIBuf.testAttrEmpty) ... ok
testAttrNone (test_msgspec.TestMessageCAIBuf.testAttrNone) ... ok
testAttrType (test_msgspec.TestMessageCAIBuf.testAttrType) ... ok
testDataMissing (test_msgspec.TestMessageCAIBuf.testDataMissing) ... ok
testDataNone (test_msgspec.TestMessageCAIBuf.testDataNone) ... ok
testDataType (test_msgspec.TestMessageCAIBuf.testDataType) ... ok
testDataValue (test_msgspec.TestMessageCAIBuf.testDataValue) ... ok
testDescrMissing (test_msgspec.TestMessageCAIBuf.testDescrMissing) ... ok
testDescrNone (test_msgspec.TestMessageCAIBuf.testDescrNone) ... ok
testDescrType (test_msgspec.TestMessageCAIBuf.testDescrType) ... ok
testDescrWarning (test_msgspec.TestMessageCAIBuf.testDescrWarning) ... ok
testMask (test_msgspec.TestMessageCAIBuf.testMask) ... ok
testNonContiguous (test_msgspec.TestMessageCAIBuf.testNonContiguous) ... ok
testReadonly (test_msgspec.TestMessageCAIBuf.testReadonly) ... ok
testShapeMissing (test_msgspec.TestMessageCAIBuf.testShapeMissing) ... ok
testShapeNone (test_msgspec.TestMessageCAIBuf.testShapeNone) ... ok
testShapeType (test_msgspec.TestMessageCAIBuf.testShapeType) ... ok
testShapeValue (test_msgspec.TestMessageCAIBuf.testShapeValue) ... ok
testStridesMissing (test_msgspec.TestMessageCAIBuf.testStridesMissing) ... ok
testStridesNone (test_msgspec.TestMessageCAIBuf.testStridesNone) ... ok
testStridesType (test_msgspec.TestMessageCAIBuf.testStridesType) ... ok
testTypestrEndian (test_msgspec.TestMessageCAIBuf.testTypestrEndian) ... ok
testTypestrItemsize (test_msgspec.TestMessageCAIBuf.testTypestrItemsize) ... ok
testTypestrMissing (test_msgspec.TestMessageCAIBuf.testTypestrMissing) ... ok
testTypestrNone (test_msgspec.TestMessageCAIBuf.testTypestrNone) ... ok
testTypestrType (test_msgspec.TestMessageCAIBuf.testTypestrType) ... ok
testByteOffset (test_msgspec.TestMessageDLPackCPUBuf.testByteOffset) ... ok
testCapsule (test_msgspec.TestMessageDLPackCPUBuf.testCapsule) ... ok
testContiguous (test_msgspec.TestMessageDLPackCPUBuf.testContiguous) ... ok
testDevice (test_msgspec.TestMessageDLPackCPUBuf.testDevice) ... ok
testDtypeCode (test_msgspec.TestMessageDLPackCPUBuf.testDtypeCode) ... ok
testDtypeLanes (test_msgspec.TestMessageDLPackCPUBuf.testDtypeLanes) ... ok
testNdim (test_msgspec.TestMessageDLPackCPUBuf.testNdim) ... ok
testReadonly (test_msgspec.TestMessageDLPackCPUBuf.testReadonly) ... ok
testShape (test_msgspec.TestMessageDLPackCPUBuf.testShape) ... ok
testStrides (test_msgspec.TestMessageDLPackCPUBuf.testStrides) ... ok
testVersion (test_msgspec.TestMessageDLPackCPUBuf.testVersion) ... ok
testMessageArray (test_msgspec.TestMessageRMA.testMessageArray) ... ok
testMessageBad (test_msgspec.TestMessageRMA.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageRMA.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageRMA.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageRMA.testMessageBytes) ... ok
testMessageCAIBuf (test_msgspec.TestMessageRMA.testMessageCAIBuf) ... ok
testMessageCuPy (test_msgspec.TestMessageRMA.testMessageCuPy) ... skipped 'cupy'
testMessageNone (test_msgspec.TestMessageRMA.testMessageNone) ... ok
testMessageNumPy (test_msgspec.TestMessageRMA.testMessageNumPy) ... ok
testMessageNumba (test_msgspec.TestMessageRMA.testMessageNumba) ... skipped 'numba'
testMessageBad (test_msgspec.TestMessageReduce.testMessageBad) ... ok
testMessageBad (test_msgspec.TestMessageReduceScatter.testMessageBad) ... ok
testMessageBad (test_msgspec.TestMessageSimple.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageSimple.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageSimple.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageSimple.testMessageBytes) ... ok
testMessageMemoryView (test_msgspec.TestMessageSimple.testMessageMemoryView) ... ok
testMessageNone (test_msgspec.TestMessageSimple.testMessageNone) ... ok
testArray1 (test_msgspec.TestMessageSimpleArray.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleArray.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleArray.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleArray.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleArray.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleArray.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleArray.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleCAIBuf.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleCAIBuf.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleCAIBuf.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleCAIBuf.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleCAIBuf.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleCAIBuf.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleCAIBuf.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleCuPy.testArray1) ... skipped 'cupy'
testArray2 (test_msgspec.TestMessageSimpleCuPy.testArray2) ... skipped 'cupy'
testArray3 (test_msgspec.TestMessageSimpleCuPy.testArray3) ... skipped 'cupy'
testArray4 (test_msgspec.TestMessageSimpleCuPy.testArray4) ... skipped 'cupy'
testArray5 (test_msgspec.TestMessageSimpleCuPy.testArray5) ... skipped 'cupy'
testArray6 (test_msgspec.TestMessageSimpleCuPy.testArray6) ... skipped 'cupy'
testBuffer (test_msgspec.TestMessageSimpleCuPy.testBuffer) ... skipped 'cupy'
testNotContiguous (test_msgspec.TestMessageSimpleCuPy.testNotContiguous) ... skipped 'cupy'
testOrderC (test_msgspec.TestMessageSimpleCuPy.testOrderC) ... skipped 'cupy'
testOrderFortran (test_msgspec.TestMessageSimpleCuPy.testOrderFortran) ... skipped 'cupy'
testArray1 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackCPUBuf.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackGPUBuf.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleNumPy.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleNumPy.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleNumPy.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleNumPy.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleNumPy.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleNumPy.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleNumPy.testBuffer) ... ok
testByteOrder (test_msgspec.TestMessageSimpleNumPy.testByteOrder) ... ok
testNotContiguous (test_msgspec.TestMessageSimpleNumPy.testNotContiguous) ... ok
testNotWriteable (test_msgspec.TestMessageSimpleNumPy.testNotWriteable) ... ok
testOrderC (test_msgspec.TestMessageSimpleNumPy.testOrderC) ... ok
testOrderFortran (test_msgspec.TestMessageSimpleNumPy.testOrderFortran) ... ok
testReadonly (test_msgspec.TestMessageSimpleNumPy.testReadonly) ... ok
testArray1 (test_msgspec.TestMessageSimpleNumba.testArray1) ... skipped 'numba'
testArray2 (test_msgspec.TestMessageSimpleNumba.testArray2) ... skipped 'numba'
testArray3 (test_msgspec.TestMessageSimpleNumba.testArray3) ... skipped 'numba'
testArray4 (test_msgspec.TestMessageSimpleNumba.testArray4) ... skipped 'numba'
testArray5 (test_msgspec.TestMessageSimpleNumba.testArray5) ... skipped 'numba'
testArray6 (test_msgspec.TestMessageSimpleNumba.testArray6) ... skipped 'numba'
testBuffer (test_msgspec.TestMessageSimpleNumba.testBuffer) ... skipped 'numba'
testNotContiguous (test_msgspec.TestMessageSimpleNumba.testNotContiguous) ... skipped 'numba'
testOrderC (test_msgspec.TestMessageSimpleNumba.testOrderC) ... skipped 'numba'
testOrderFortran (test_msgspec.TestMessageSimpleNumba.testOrderFortran) ... skipped 'numba'
testMessageBad (test_msgspec.TestMessageVector.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageVector.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageVector.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageVector.testMessageBytes) ... ok
testMessageNone (test_msgspec.TestMessageVector.testMessageNone) ... ok
testArray1 (test_msgspec.TestMessageVectorArray.testArray1) ... ok
testArray2 (test_msgspec.TestMessageVectorArray.testArray2) ... ok
testArray3 (test_msgspec.TestMessageVectorArray.testArray3) ... ok
testArray4 (test_msgspec.TestMessageVectorArray.testArray4) ... ok
testArray5 (test_msgspec.TestMessageVectorArray.testArray5) ... ok
testArray6 (test_msgspec.TestMessageVectorArray.testArray6) ... ok
testArray1 (test_msgspec.TestMessageVectorCAIBuf.testArray1) ... ok
testArray2 (test_msgspec.TestMessageVectorCAIBuf.testArray2) ... ok
testArray3 (test_msgspec.TestMessageVectorCAIBuf.testArray3) ... ok
testArray4 (test_msgspec.TestMessageVectorCAIBuf.testArray4) ... ok
testArray5 (test_msgspec.TestMessageVectorCAIBuf.testArray5) ... ok
testArray6 (test_msgspec.TestMessageVectorCAIBuf.testArray6) ... ok
testArray1 (test_msgspec.TestMessageVectorCuPy.testArray1) ... skipped 'cupy'
testArray2 (test_msgspec.TestMessageVectorCuPy.testArray2) ... skipped 'cupy'
testArray3 (test_msgspec.TestMessageVectorCuPy.testArray3) ... skipped 'cupy'
testArray4 (test_msgspec.TestMessageVectorCuPy.testArray4) ... skipped 'cupy'
testArray5 (test_msgspec.TestMessageVectorCuPy.testArray5) ... skipped 'cupy'
testArray6 (test_msgspec.TestMessageVectorCuPy.testArray6) ... skipped 'cupy'
testArray1 (test_msgspec.TestMessageVectorNumPy.testArray1) ... ok
testArray2 (test_msgspec.TestMessageVectorNumPy.testArray2) ... ok
testArray3 (test_msgspec.TestMessageVectorNumPy.testArray3) ... ok
testArray4 (test_msgspec.TestMessageVectorNumPy.testArray4) ... ok
testArray5 (test_msgspec.TestMessageVectorNumPy.testArray5) ... ok
testArray6 (test_msgspec.TestMessageVectorNumPy.testArray6) ... ok
testCountNumPyArray (test_msgspec.TestMessageVectorNumPy.testCountNumPyArray) ... ok
testCountNumPyScalar (test_msgspec.TestMessageVectorNumPy.testCountNumPyScalar) ... ok
testCountNumPyZeroDim (test_msgspec.TestMessageVectorNumPy.testCountNumPyZeroDim) ... ok
testArray1 (test_msgspec.TestMessageVectorNumba.testArray1) ... skipped 'numba'
testArray2 (test_msgspec.TestMessageVectorNumba.testArray2) ... skipped 'numba'
testArray3 (test_msgspec.TestMessageVectorNumba.testArray3) ... skipped 'numba'
testArray4 (test_msgspec.TestMessageVectorNumba.testArray4) ... skipped 'numba'
testArray5 (test_msgspec.TestMessageVectorNumba.testArray5) ... skipped 'numba'
testArray6 (test_msgspec.TestMessageVectorNumba.testArray6) ... skipped 'numba'
testMessageArray (test_msgspec.TestMessageVectorW.testMessageArray) ... ok
testMessageBad (test_msgspec.TestMessageVectorW.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageVectorW.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageVectorW.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageVectorW.testMessageBytes) ... ok
testMessageCAIBuf (test_msgspec.TestMessageVectorW.testMessageCAIBuf) ... ok
testMessageCuPy (test_msgspec.TestMessageVectorW.testMessageCuPy) ... skipped 'cupy'
testMessageNumPy (test_msgspec.TestMessageVectorW.testMessageNumPy) ... ok
testMessageNumba (test_msgspec.TestMessageVectorW.testMessageNumba) ... skipped 'numba'
testCollectivesBlock (test_msgzero.TestMessageZeroSelf.testCollectivesBlock) ... ok
testCollectivesVector (test_msgzero.TestMessageZeroSelf.testCollectivesVector) ... ok
testPointToPoint (test_msgzero.TestMessageZeroSelf.testPointToPoint) ... ok
testReductions (test_msgzero.TestMessageZeroSelf.testReductions) ... ok
testCollectivesBlock (test_msgzero.TestMessageZeroWorld.testCollectivesBlock) ... ok
testCollectivesVector (test_msgzero.TestMessageZeroWorld.testCollectivesVector) ... ok
testPointToPoint (test_msgzero.TestMessageZeroWorld.testPointToPoint) ... ok
testReductions (test_msgzero.TestMessageZeroWorld.testReductions) ... ok
testAHandleOf (test_objmodel.TestObjModel.testAHandleOf) ... ok
testAddressOf (test_objmodel.TestObjModel.testAddressOf) ... ok
testBool (test_objmodel.TestObjModel.testBool) ... ok
testCAPI (test_objmodel.TestObjModel.testCAPI) ... ok
testCmp (test_objmodel.TestObjModel.testCmp) ... ok
testConstants (test_objmodel.TestObjModel.testConstants) ... ok
testEq (test_objmodel.TestObjModel.testEq) ... ok
testHandle (test_objmodel.TestObjModel.testHandle) ... ok
testHash (test_objmodel.TestObjModel.testHash) ... ok
testInit (test_objmodel.TestObjModel.testInit) ... ok
testNe (test_objmodel.TestObjModel.testNe) ... ok
testReduce (test_objmodel.TestObjModel.testReduce) ... ok
testSafeFreeConstant (test_objmodel.TestObjModel.testSafeFreeConstant) ... ok
testSafeFreeCreated (test_objmodel.TestObjModel.testSafeFreeCreated) ... ok
testSafeFreeNull (test_objmodel.TestObjModel.testSafeFreeNull) ... ok
testSizeOf (test_objmodel.TestObjModel.testSizeOf) ... ok
testWeakRef (test_objmodel.TestObjModel.testWeakRef) ... ok
testCall (test_op.TestOp.testCall) ... ok
testConstructor (test_op.TestOp.testConstructor) ... ok
testCreate (test_op.TestOp.testCreate) ... ok
testCreateMany (test_op.TestOp.testCreateMany) ... ok
testIsCommutative (test_op.TestOp.testIsCommutative) ... ok
testIsCommutativeExtra (test_op.TestOp.testIsCommutativeExtra) ... ok
testIsPredefined (test_op.TestOp.testIsPredefined) ... ok
testMinMax (test_op.TestOp.testMinMax) ... ok
testMinMaxLoc (test_op.TestOp.testMinMaxLoc) ... ok
testPicklePredefined (test_op.TestOp.testPicklePredefined) ... ok
testPickleUserDefined (test_op.TestOp.testPickleUserDefined) ... ok
testConstructor (test_p2p_buf.TestP2PBufSelf.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufSelf.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufSelf.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufSelf.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufSelf.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufSelf.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufSelf.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufSelf.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufSelf.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufSelf.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufSelf.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufSelf.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufSelf.testSendrecvReplace) ... ok
testConstructor (test_p2p_buf.TestP2PBufSelfDup.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufSelfDup.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufSelfDup.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufSelfDup.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufSelfDup.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufSelfDup.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufSelfDup.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufSelfDup.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufSelfDup.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufSelfDup.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufSelfDup.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufSelfDup.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufSelfDup.testSendrecvReplace) ... ok
testConstructor (test_p2p_buf.TestP2PBufWorld.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufWorld.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufWorld.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufWorld.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufWorld.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufWorld.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufWorld.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufWorld.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufWorld.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufWorld.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufWorld.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufWorld.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufWorld.testSendrecvReplace) ... ok
testConstructor (test_p2p_buf.TestP2PBufWorldDup.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufWorldDup.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufWorldDup.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufWorldDup.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufWorldDup.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufWorldDup.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufWorldDup.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufWorldDup.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufWorldDup.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufWorldDup.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufWorldDup.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufWorldDup.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufWorldDup.testSendrecvReplace) ... ok
testMessageNoProc (test_p2p_buf_matched.TestMessage.testMessageNoProc) ... ok
testMessageNull (test_p2p_buf_matched.TestMessage.testMessageNull) ... ok
testPickle (test_p2p_buf_matched.TestMessage.testPickle) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedSelf.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedSelf.testProbeRecv) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedSelfDup.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedSelfDup.testProbeRecv) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedWorld.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedWorld.testProbeRecv) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedWorldDup.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedWorldDup.testProbeRecv) ... ok
testRing (test_p2p_buf_part.TestP2PBufPartSelf.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelf.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartSelf.testSelf) ... skipped 'mpi-p2p-part'
testRing (test_p2p_buf_part.TestP2PBufPartSelfDup.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelfDup.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartSelfDup.testSelf) ... skipped 'mpi-p2p-part'
testRing (test_p2p_buf_part.TestP2PBufPartWorld.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorld.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartWorld.testSelf) ... skipped 'mpi-p2p-part'
testRing (test_p2p_buf_part.TestP2PBufPartWorldDup.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorldDup.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartWorldDup.testSelf) ... skipped 'mpi-p2p-part'
testCancel (test_p2p_obj.TestP2PObjSelf.testCancel) ... ok
testCommLock (test_p2p_obj.TestP2PObjSelf.testCommLock) ... ok
testIRecvAndBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndBSend) ... ok
testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndIBSend) ... ok
testIRecvAndISSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISSend) ... ok
testIRecvAndISend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISend) ... ok
testIRecvAndSSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSSend) ... ok
testIRecvAndSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSend) ... ok
testISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testISSendAndRecv) ... ok
testISendAndRecv (test_p2p_obj.TestP2PObjSelf.testISendAndRecv) ... ok
testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISSendAndRecv) ... ok
testManyISendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISendAndRecv) ... ok
testMixed (test_p2p_obj.TestP2PObjSelf.testMixed) ... ok
testPingPong01 (test_p2p_obj.TestP2PObjSelf.testPingPong01) ... ok
testProbe (test_p2p_obj.TestP2PObjSelf.testProbe) ... ok
testRecvObjArg (test_p2p_obj.TestP2PObjSelf.testRecvObjArg) ... ok
testSSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSSendAndRecv) ... ok
testSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSendAndRecv) ... ok
testSendrecv (test_p2p_obj.TestP2PObjSelf.testSendrecv) ... ok
testTestSomeRecv (test_p2p_obj.TestP2PObjSelf.testTestSomeRecv) ... ok
testTestSomeSend (test_p2p_obj.TestP2PObjSelf.testTestSomeSend) ... ok
testWaitSomeRecv (test_p2p_obj.TestP2PObjSelf.testWaitSomeRecv) ... ok
testWaitSomeSend (test_p2p_obj.TestP2PObjSelf.testWaitSomeSend) ... ok
testCancel (test_p2p_obj.TestP2PObjSelfDup.testCancel) ... ok
testCommLock (test_p2p_obj.TestP2PObjSelfDup.testCommLock) ... ok
testIRecvAndBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndBSend) ... ok
testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndIBSend) ... ok
testIRecvAndISSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISSend) ... ok
testIRecvAndISend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISend) ... ok
testIRecvAndSSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSSend) ... ok
testIRecvAndSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSend) ... ok
testISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISSendAndRecv) ... ok
testISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISendAndRecv) ... ok
testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISSendAndRecv) ... ok
testManyISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISendAndRecv) ... ok
testMixed (test_p2p_obj.TestP2PObjSelfDup.testMixed) ... ok
testPingPong01 (test_p2p_obj.TestP2PObjSelfDup.testPingPong01) ... ok
testProbe (test_p2p_obj.TestP2PObjSelfDup.testProbe) ... ok
testRecvObjArg (test_p2p_obj.TestP2PObjSelfDup.testRecvObjArg) ... ok
testSSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSSendAndRecv) ... ok
testSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSendAndRecv) ... ok
testSendrecv (test_p2p_obj.TestP2PObjSelfDup.testSendrecv) ... ok
testTestSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testTestSomeRecv) ... ok
testTestSomeSend (test_p2p_obj.TestP2PObjSelfDup.testTestSomeSend) ... ok
testWaitSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeRecv) ... ok
testWaitSomeSend (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeSend) ... ok
testCancel (test_p2p_obj.TestP2PObjWorld.testCancel) ... ok
testCommLock (test_p2p_obj.TestP2PObjWorld.testCommLock) ... ok
testIRecvAndBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndBSend) ... ok
testIRecvAndIBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndIBSend) ... ok
testIRecvAndISSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISSend) ... ok
testIRecvAndISend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISend) ... ok
testIRecvAndSSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSSend) ... ok
testIRecvAndSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSend) ... ok
testISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testISSendAndRecv) ... ok
testISendAndRecv (test_p2p_obj.TestP2PObjWorld.testISendAndRecv) ... ok
testManyISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISSendAndRecv) ... ok
testManyISendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISendAndRecv) ... ok
testMixed (test_p2p_obj.TestP2PObjWorld.testMixed) ... ok
testPingPong01 (test_p2p_obj.TestP2PObjWorld.testPingPong01) ... ok
testProbe (test_p2p_obj.TestP2PObjWorld.testProbe) ... ok
testRecvObjArg (test_p2p_obj.TestP2PObjWorld.testRecvObjArg) ... ok
testSSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSSendAndRecv) ... ok
testSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSendAndRecv) ... ok
testSendrecv (test_p2p_obj.TestP2PObjWorld.testSendrecv) ... ok
testTestSomeRecv (test_p2p_obj.TestP2PObjWorld.testTestSomeRecv) ... ok
testTestSomeSend (test_p2p_obj.TestP2PObjWorld.testTestSomeSend) ... ok
testWaitSomeRecv (test_p2p_obj.TestP2PObjWorld.testWaitSomeRecv) ... ok
testWaitSomeSend (test_p2p_obj.TestP2PObjWorld.testWaitSomeSend) ... ok
testCancel (test_p2p_obj.TestP2PObjWorldDup.testCancel) ... ok
testCommLock (test_p2p_obj.TestP2PObjWorldDup.testCommLock) ... ok
testIRecvAndBSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndBSend) ... ok
testIRecvAndIBSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndIBSend) ... ok
testIRecvAndISSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndISSend) ... ok
testIRecvAndISend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndISend) ... ok
testIRecvAndSSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndSSend) ... ok
testIRecvAndSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndSend) ... ok
testISSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testISSendAndRecv) ... ok
testISendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testISendAndRecv) ... ok
testManyISSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testManyISSendAndRecv) ... ok
testManyISendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testManyISendAndRecv) ... ok
testMixed (test_p2p_obj.TestP2PObjWorldDup.testMixed) ... ok
testPingPong01 (test_p2p_obj.TestP2PObjWorldDup.testPingPong01) ... ok
testProbe (test_p2p_obj.TestP2PObjWorldDup.testProbe) ... ok
testRecvObjArg (test_p2p_obj.TestP2PObjWorldDup.testRecvObjArg) ... ok
testSSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testSSendAndRecv) ... ok
testSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testSendAndRecv) ... ok
testSendrecv (test_p2p_obj.TestP2PObjWorldDup.testSendrecv) ... ok
testTestSomeRecv (test_p2p_obj.TestP2PObjWorldDup.testTestSomeRecv) ... ok
testTestSomeSend (test_p2p_obj.TestP2PObjWorldDup.testTestSomeSend) ... ok
testWaitSomeRecv (test_p2p_obj.TestP2PObjWorldDup.testWaitSomeRecv) ... ok
testWaitSomeSend (test_p2p_obj.TestP2PObjWorldDup.testWaitSomeSend) ... ok
testMessageNoProc (test_p2p_obj_matched.TestMessage.testMessageNoProc) ... ok
testMessageNull (test_p2p_obj_matched.TestMessage.testMessageNull) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedSelf.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedSelf.testProbeRecv) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedSelfDup.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedSelfDup.testProbeRecv) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedWorld.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedWorld.testProbeRecv) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedWorldDup.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedWorldDup.testProbeRecv) ... ok
testPackSize (test_pack.TestPackExternal.testPackSize) ... ok
testPackUnpackExternal (test_pack.TestPackExternal.testPackUnpackExternal) ... ok
testPackSize (test_pack.TestPackSelf.testPackSize) ... ok
testPackUnpack (test_pack.TestPackSelf.testPackUnpack) ... ok
testPackSize (test_pack.TestPackWorld.testPackSize) ... ok
testPackUnpack (test_pack.TestPackWorld.testPackUnpack) ... ok
testCython (test_package.TestDataFiles.testCython) ... ok
testHeaders (test_package.TestDataFiles.testHeaders) ... ok
testTyping (test_package.TestDataFiles.testTyping) ... ok
testImportBench (test_package.TestImport.testImportBench) ... ok
testImportFutures (test_package.TestImport.testImportFutures) ... ok
testImportMPI (test_package.TestImport.testImportMPI) ... ok
testImportRun (test_package.TestImport.testImportRun) ... ok
testImportTyping (test_package.TestImport.testImportTyping) ... ok
testImportUtil (test_package.TestImport.testImportUtil) ... ok
testDefault (test_pickle.TestPickle.testDefault) ... ok
testDill (test_pickle.TestPickle.testDill) ... skipped 'dill'
testJson (test_pickle.TestPickle.testJson) ... ok
testMarshal (test_pickle.TestPickle.testMarshal) ... ok
testPyPickle (test_pickle.TestPickle.testPyPickle) ... ok
testYAML (test_pickle.TestPickle.testYAML) ... skipped 'yaml'
testGetStatus (test_request.TestRequest.testGetStatus) ... ok
testTest (test_request.TestRequest.testTest) ... ok
testWait (test_request.TestRequest.testWait) ... ok
testGetStatusAll (test_request.TestRequestArray.testGetStatusAll) ... ok
testGetStatusAny (test_request.TestRequestArray.testGetStatusAny) ... ok
testGetStatusSome (test_request.TestRequestArray.testGetStatusSome) ... ok
testTestall (test_request.TestRequestArray.testTestall) ... ok
testTestany (test_request.TestRequestArray.testTestany) ... ok
testTestsome (test_request.TestRequestArray.testTestsome) ... ok
testWaitall (test_request.TestRequestArray.testWaitall) ... ok
testWaitany (test_request.TestRequestArray.testWaitany) ... ok
testWaitsome (test_request.TestRequestArray.testWaitsome) ... ok
testAccumulate (test_rma.TestRMASelf.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma.TestRMASelf.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma.TestRMASelf.testAccumulateProcNullSum) ... ok
testCompareAndSwap (test_rma.TestRMASelf.testCompareAndSwap) ... ok
testFence (test_rma.TestRMASelf.testFence) ... ok
testFenceAll (test_rma.TestRMASelf.testFenceAll) ... ok
testFetchAndOp (test_rma.TestRMASelf.testFetchAndOp) ... ok
testFlush (test_rma.TestRMASelf.testFlush) ... ok
testGetAccumulate (test_rma.TestRMASelf.testGetAccumulate) ... ok
testGetAccumulateProcNull (test_rma.TestRMASelf.testGetAccumulateProcNull) ... ok
testGetProcNull (test_rma.TestRMASelf.testGetProcNull) ... ok
testPostWait (test_rma.TestRMASelf.testPostWait) ... ok
testPutGet (test_rma.TestRMASelf.testPutGet) ... ok
testPutProcNull (test_rma.TestRMASelf.testPutProcNull) ... ok
testStartComplete (test_rma.TestRMASelf.testStartComplete) ... ok
testStartCompletePostTest (test_rma.TestRMASelf.testStartCompletePostTest) ... ok
testStartCompletePostWait (test_rma.TestRMASelf.testStartCompletePostWait) ... ok
testSync (test_rma.TestRMASelf.testSync) ... ok
testAccumulate (test_rma.TestRMAWorld.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma.TestRMAWorld.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma.TestRMAWorld.testAccumulateProcNullSum) ... ok
testCompareAndSwap (test_rma.TestRMAWorld.testCompareAndSwap) ... ok
testFence (test_rma.TestRMAWorld.testFence) ... ok
testFenceAll (test_rma.TestRMAWorld.testFenceAll) ... ok
testFetchAndOp (test_rma.TestRMAWorld.testFetchAndOp) ... ok
testFlush (test_rma.TestRMAWorld.testFlush) ... ok
testGetAccumulate (test_rma.TestRMAWorld.testGetAccumulate) ... ok
testGetAccumulateProcNull (test_rma.TestRMAWorld.testGetAccumulateProcNull) ... ok
testGetProcNull (test_rma.TestRMAWorld.testGetProcNull) ... ok
testPostWait (test_rma.TestRMAWorld.testPostWait) ... ok
testPutGet (test_rma.TestRMAWorld.testPutGet) ... ok
testPutProcNull (test_rma.TestRMAWorld.testPutProcNull) ... ok
testStartComplete (test_rma.TestRMAWorld.testStartComplete) ... ok
testStartCompletePostTest (test_rma.TestRMAWorld.testStartCompletePostTest) ... ok
testStartCompletePostWait (test_rma.TestRMAWorld.testStartCompletePostWait) ... ok
testSync (test_rma.TestRMAWorld.testSync) ... ok
testAccumulate (test_rma_nb.TestRMASelf.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma_nb.TestRMASelf.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma_nb.TestRMASelf.testAccumulateProcNullSum) ... ok
testGetAccumulate (test_rma_nb.TestRMASelf.testGetAccumulate) ... ok
testGetProcNull (test_rma_nb.TestRMASelf.testGetProcNull) ... ok
testPutGet (test_rma_nb.TestRMASelf.testPutGet) ... ok
testPutProcNull (test_rma_nb.TestRMASelf.testPutProcNull) ... ok
testAccumulate (test_rma_nb.TestRMAWorld.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma_nb.TestRMAWorld.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma_nb.TestRMAWorld.testAccumulateProcNullSum) ... ok
testGetAccumulate (test_rma_nb.TestRMAWorld.testGetAccumulate) ... ok
testGetProcNull (test_rma_nb.TestRMAWorld.testGetProcNull) ... ok
testPutGet (test_rma_nb.TestRMAWorld.testPutGet) ... ok
testPutProcNull (test_rma_nb.TestRMAWorld.testPutProcNull) ... ok
testBuffering (test_session.TestSession.testBuffering) ... ok
testPickle (test_session.TestSession.testPickle) ... ok
testSessionGetInfo (test_session.TestSession.testSessionGetInfo) ... ok
testSessionInit (test_session.TestSession.testSessionInit) ... ok
testSessionPsetGroup (test_session.TestSession.testSessionPsetGroup) ... ok
testSessionPsetInfo (test_session.TestSession.testSessionPsetInfo) ... ok
testSessionPsets (test_session.TestSession.testSessionPsets) ... ok
testSessionSELF (test_session.TestSession.testSessionSELF) ... ok
testSessionWORLD (test_session.TestSession.testSessionWORLD) ... ok
testArgsBad (test_spawn.TestSpawnMultipleSelf.testArgsBad) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelf.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnMultipleSelf.testCommSpawn) ... ok
testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults1) ... ok
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults2) ... ok
testErrcodes (test_spawn.TestSpawnMultipleSelf.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnMultipleSelf.testNoArgs) ... ok
testArgsBad (test_spawn.TestSpawnMultipleSelfMany.testArgsBad) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelfMany.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnMultipleSelfMany.testCommSpawn) ... ok
testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults1) ... ok
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults2) ... ok
testErrcodes (test_spawn.TestSpawnMultipleSelfMany.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnMultipleSelfMany.testNoArgs) ... ok
testArgsBad (test_spawn.TestSpawnMultipleWorld.testArgsBad) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorld.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnMultipleWorld.testCommSpawn) ... ok
testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults1) ... ok
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults2) ... ok
testErrcodes (test_spawn.TestSpawnMultipleWorld.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnMultipleWorld.testNoArgs) ... ok
testArgsBad (test_spawn.TestSpawnMultipleWorldMany.testArgsBad) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorldMany.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnMultipleWorldMany.testCommSpawn) ... ok
testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults1) ... ok
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults2) ... ok
testErrcodes (test_spawn.TestSpawnMultipleWorldMany.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnMultipleWorldMany.testNoArgs) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelf.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnSingleSelf.testCommSpawn) ... ok
testErrcodes (test_spawn.TestSpawnSingleSelf.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnSingleSelf.testNoArgs) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelfMany.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnSingleSelfMany.testCommSpawn) ... ok
testErrcodes (test_spawn.TestSpawnSingleSelfMany.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnSingleSelfMany.testNoArgs) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorld.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnSingleWorld.testCommSpawn) ... ok
testErrcodes (test_spawn.TestSpawnSingleWorld.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnSingleWorld.testNoArgs) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorldMany.testArgsOnlyAtRoot) ... ok
testCommSpawn (test_spawn.TestSpawnSingleWorldMany.testCommSpawn) ... ok
testErrcodes (test_spawn.TestSpawnSingleWorldMany.testErrcodes) ... ok
testNoArgs (test_spawn.TestSpawnSingleWorldMany.testNoArgs) ... ok
testConstructor (test_status.TestStatus.testConstructor) ... ok
testCopyConstructor (test_status.TestStatus.testCopyConstructor) ... ok
testDefaultFieldValues (test_status.TestStatus.testDefaultFieldValues) ... ok
testGetCount (test_status.TestStatus.testGetCount) ... ok
testGetElements (test_status.TestStatus.testGetElements) ... ok
testIsCancelled (test_status.TestStatus.testIsCancelled) ... ok
testPickle (test_status.TestStatus.testPickle) ... ok
testPyProps (test_status.TestStatus.testPyProps) ... ok
testSetCancelled (test_status.TestStatus.testSetCancelled) ... ok
testSetElements (test_status.TestStatus.testSetElements) ... ok
testCloneFree (test_subclass.TestMyCartcommNULL.testCloneFree) ... ok
testSubType (test_subclass.TestMyCartcommNULL.testSubType) ... ok
testCloneFree (test_subclass.TestMyCartcommSELF.testCloneFree) ... ok
testSubType (test_subclass.TestMyCartcommSELF.testSubType) ... ok
testCloneFree (test_subclass.TestMyCartcommWORLD.testCloneFree) ... ok
testSubType (test_subclass.TestMyCartcommWORLD.testSubType) ... ok
testCloneFree (test_subclass.TestMyCommNULL.testCloneFree) ... ok
testSubType (test_subclass.TestMyCommNULL.testSubType) ... ok
testCloneFree (test_subclass.TestMyCommSELF.testCloneFree) ... ok
testSubType (test_subclass.TestMyCommSELF.testSubType) ... ok
testCloneFree (test_subclass.TestMyCommWORLD.testCloneFree) ... ok
testSubType (test_subclass.TestMyCommWORLD.testSubType) ... ok
testFree (test_subclass.TestMyFile.testFree) ... ok
testSubType (test_subclass.TestMyFile.testSubType) ... ok
testCloneFree (test_subclass.TestMyGraphcommNULL.testCloneFree) ... ok
testSubType (test_subclass.TestMyGraphcommNULL.testSubType) ... ok
testCloneFree (test_subclass.TestMyGraphcommSELF.testCloneFree) ... ok
testSubType (test_subclass.TestMyGraphcommSELF.testSubType) ... ok
testCloneFree (test_subclass.TestMyGraphcommWORLD.testCloneFree) ... ok
testSubType (test_subclass.TestMyGraphcommWORLD.testSubType) ... ok
testSubType (test_subclass.TestMyGrequest.testSubType) ... ok
testCreateDupType (test_subclass.TestMyInfo.testCreateDupType) ...
ok testCreateEnvType (test_subclass.TestMyInfo.testCreateEnvType) ... ok testFree (test_subclass.TestMyInfo.testFree) ... ok testPickle (test_subclass.TestMyInfo.testPickle) ... ok testSubType (test_subclass.TestMyInfo.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommWORLD.testSubType) ... ok testSubType (test_subclass.TestMyPrequest.testSubType) ... ok testStart (test_subclass.TestMyPrequest2.testStart) ... ok testSubType (test_subclass.TestMyPrequest2.testSubType) ... ok testSubType (test_subclass.TestMyRequest.testSubType) ... ok testSubType (test_subclass.TestMyRequest2.testSubType) ... ok testFree (test_subclass.TestMyWin.testFree) ... ok testSubType (test_subclass.TestMyWin.testSubType) ... ok testIsThreadMain (test_threads.TestMPIThreads.testIsThreadMain) ... ok testIsThreadMainInThread (test_threads.TestMPIThreads.testIsThreadMainInThread) ... ok testThreadLevels (test_threads.TestMPIThreads.testThreadLevels) ... ok testGetConfig (test_toplevel.TestConfig.testGetConfig) ... ok testGetInclude (test_toplevel.TestConfig.testGetInclude) ... ok testProfile (test_toplevel.TestProfile.testProfile) ... ok testBadAttribute (test_toplevel.TestRC.testBadAttribute) ... ok testCallKwArgs (test_toplevel.TestRC.testCallKwArgs) ... ok testInitKwArgs (test_toplevel.TestRC.testInitKwArgs) ... ok testRepr (test_toplevel.TestRC.testRepr) ... ok testAckFailed (test_ulfm.TestULFMInter.testAckFailed) ... skipped 'mpi-world-size<2' testAgree (test_ulfm.TestULFMInter.testAgree) ... skipped 'mpi-world-size<2' testGetFailed (test_ulfm.TestULFMInter.testGetFailed) ... 
skipped 'mpi-world-size<2' testIAgree (test_ulfm.TestULFMInter.testIAgree) ... skipped 'mpi-world-size<2' testIShrink (test_ulfm.TestULFMInter.testIShrink) ... skipped 'mpi-world-size<2' testIsRevoked (test_ulfm.TestULFMInter.testIsRevoked) ... skipped 'mpi-world-size<2' testRevoke (test_ulfm.TestULFMInter.testRevoke) ... skipped 'mpi-world-size<2' testShrink (test_ulfm.TestULFMInter.testShrink) ... skipped 'mpi-world-size<2' testAckFailed (test_ulfm.TestULFMSelf.testAckFailed) ... ok testAgree (test_ulfm.TestULFMSelf.testAgree) ... ok testGetFailed (test_ulfm.TestULFMSelf.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMSelf.testIAgree) ... ok testIShrink (test_ulfm.TestULFMSelf.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMSelf.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMSelf.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMSelf.testShrink) ... ok testAckFailed (test_ulfm.TestULFMWorld.testAckFailed) ... ok testAgree (test_ulfm.TestULFMWorld.testAgree) ... ok testGetFailed (test_ulfm.TestULFMWorld.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMWorld.testIAgree) ... ok testIShrink (test_ulfm.TestULFMWorld.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMWorld.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMWorld.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMWorld.testShrink) ... ok testAlignmentComplex (test_util_dtlib.TestUtilDTLib.testAlignmentComplex) ... ok testAlignmentPair (test_util_dtlib.TestUtilDTLib.testAlignmentPair) ... ok testAlignmentStruct (test_util_dtlib.TestUtilDTLib.testAlignmentStruct) ... ok testBasic (test_util_dtlib.TestUtilDTLib.testBasic) ... ok testF77 (test_util_dtlib.TestUtilDTLib.testF77) ... ok testF90 (test_util_dtlib.TestUtilDTLib.testF90) ... ok testF90Complex (test_util_dtlib.TestUtilDTLib.testF90Complex) ... ok testF90Integer (test_util_dtlib.TestUtilDTLib.testF90Integer) ... ok testF90Real (test_util_dtlib.TestUtilDTLib.testF90Real) ... 
ok
testFailures (test_util_dtlib.TestUtilDTLib.testFailures) ... ok
testHIndexed (test_util_dtlib.TestUtilDTLib.testHIndexed) ... ok
testHVector (test_util_dtlib.TestUtilDTLib.testHVector) ... ok
testIndexed (test_util_dtlib.TestUtilDTLib.testIndexed) ... ok
testMissingNumPy (test_util_dtlib.TestUtilDTLib.testMissingNumPy) ... ok
testPair (test_util_dtlib.TestUtilDTLib.testPair) ... ok
testPairStruct (test_util_dtlib.TestUtilDTLib.testPairStruct) ... ok
testStruct1 (test_util_dtlib.TestUtilDTLib.testStruct1) ... ok
testStruct2 (test_util_dtlib.TestUtilDTLib.testStruct2) ... ok
testStruct3 (test_util_dtlib.TestUtilDTLib.testStruct3) ... ok
testStruct4 (test_util_dtlib.TestUtilDTLib.testStruct4) ... ok
testStruct5 (test_util_dtlib.TestUtilDTLib.testStruct5) ... ok
testSubarray1 (test_util_dtlib.TestUtilDTLib.testSubarray1) ... ok
testSubarray2 (test_util_dtlib.TestUtilDTLib.testSubarray2) ... ok
testVector (test_util_dtlib.TestUtilDTLib.testVector) ... ok
testAllgatherInter (test_util_pkl5.TestMPISelf.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestMPISelf.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestMPISelf.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestMPISelf.testAlltoallIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestMPISelf.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestMPISelf.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestMPISelf.testBcastIntra) ... ok
testBigMPI (test_util_pkl5.TestMPISelf.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestMPISelf.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestMPISelf.testGatherIntra) ... ok
testGetStatusAll (test_util_pkl5.TestMPISelf.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestMPISelf.testIBSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestMPISelf.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestMPISelf.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPISelf.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestMPISelf.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestMPISelf.testIrecv) ... ok
testMProbe (test_util_pkl5.TestMPISelf.testMProbe) ... ok
testMessage (test_util_pkl5.TestMPISelf.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPISelf.testMessageProbeIProbe) ... ok
testPingPong01 (test_util_pkl5.TestMPISelf.testPingPong01) ... ok
testProbe (test_util_pkl5.TestMPISelf.testProbe) ... ok
testRequest (test_util_pkl5.TestMPISelf.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPISelf.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestMPISelf.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestMPISelf.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestMPISelf.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestMPISelf.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestMPISelf.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestMPISelf.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestMPISelf.testWaitAll) ... ok
testAllgatherInter (test_util_pkl5.TestMPIWorld.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestMPIWorld.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestMPIWorld.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestMPIWorld.testAlltoallIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestMPIWorld.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestMPIWorld.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestMPIWorld.testBcastIntra) ... ok
testBigMPI (test_util_pkl5.TestMPIWorld.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestMPIWorld.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestMPIWorld.testGatherIntra) ... ok
testGetStatusAll (test_util_pkl5.TestMPIWorld.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestMPIWorld.testIBSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestMPIWorld.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestMPIWorld.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPIWorld.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestMPIWorld.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestMPIWorld.testIrecv) ... ok
testMProbe (test_util_pkl5.TestMPIWorld.testMProbe) ... ok
testMessage (test_util_pkl5.TestMPIWorld.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPIWorld.testMessageProbeIProbe) ... ok
testPingPong01 (test_util_pkl5.TestMPIWorld.testPingPong01) ... ok
testProbe (test_util_pkl5.TestMPIWorld.testProbe) ... ok
testRequest (test_util_pkl5.TestMPIWorld.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPIWorld.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestMPIWorld.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestMPIWorld.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestMPIWorld.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestMPIWorld.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestMPIWorld.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestMPIWorld.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestMPIWorld.testWaitAll) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5Self.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestPKL5Self.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestPKL5Self.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestPKL5Self.testAlltoallIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestPKL5Self.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestPKL5Self.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestPKL5Self.testBcastIntra) ... ok
testBigMPI (test_util_pkl5.TestPKL5Self.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestPKL5Self.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestPKL5Self.testGatherIntra) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5Self.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5Self.testIBSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestPKL5Self.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5Self.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestPKL5Self.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5Self.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestPKL5Self.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5Self.testMProbe) ... ok
testMessage (test_util_pkl5.TestPKL5Self.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5Self.testMessageProbeIProbe) ... ok
testPickle5 (test_util_pkl5.TestPKL5Self.testPickle5) ... ok
testPingPong01 (test_util_pkl5.TestPKL5Self.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5Self.testProbe) ... ok
testRequest (test_util_pkl5.TestPKL5Self.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5Self.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5Self.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestPKL5Self.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestPKL5Self.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestPKL5Self.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5Self.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestPKL5Self.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5Self.testWaitAll) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5World.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestPKL5World.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestPKL5World.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestPKL5World.testAlltoallIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestPKL5World.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestPKL5World.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestPKL5World.testBcastIntra) ... ok
testBigMPI (test_util_pkl5.TestPKL5World.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestPKL5World.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestPKL5World.testGatherIntra) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5World.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5World.testIBSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestPKL5World.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5World.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestPKL5World.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5World.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestPKL5World.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5World.testMProbe) ... ok
testMessage (test_util_pkl5.TestPKL5World.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5World.testMessageProbeIProbe) ... ok
testPickle5 (test_util_pkl5.TestPKL5World.testPickle5) ... ok
testPingPong01 (test_util_pkl5.TestPKL5World.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5World.testProbe) ... ok
testRequest (test_util_pkl5.TestPKL5World.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5World.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5World.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestPKL5World.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestPKL5World.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestPKL5World.testSendAndRecv) ...
ok
testSendrecv (test_util_pkl5.TestPKL5World.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestPKL5World.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5World.testWaitAll) ... ok
test_apply (test_util_pool.TestProcessPool.test_apply) ... ok
test_apply_async (test_util_pool.TestProcessPool.test_apply_async) ... ok
test_apply_async_timeout (test_util_pool.TestProcessPool.test_apply_async_timeout) ... ok
test_arg_initializer (test_util_pool.TestProcessPool.test_arg_initializer) ... ok
test_arg_processes (test_util_pool.TestProcessPool.test_arg_processes) ... ok
test_async_error_callback (test_util_pool.TestProcessPool.test_async_error_callback) ... ok
test_empty_iterable (test_util_pool.TestProcessPool.test_empty_iterable) ... ok
test_enter_exit (test_util_pool.TestProcessPool.test_enter_exit) ... ok
test_imap (test_util_pool.TestProcessPool.test_imap) ... ok
test_imap_unordered (test_util_pool.TestProcessPool.test_imap_unordered) ... ok
test_istarmap (test_util_pool.TestProcessPool.test_istarmap) ... ok
test_istarmap_unordered (test_util_pool.TestProcessPool.test_istarmap_unordered) ... ok
test_map (test_util_pool.TestProcessPool.test_map) ... ok
test_map_async (test_util_pool.TestProcessPool.test_map_async) ... ok
test_map_async_callbacks (test_util_pool.TestProcessPool.test_map_async_callbacks) ... ok
test_pool_worker_lifetime_early_close (test_util_pool.TestProcessPool.test_pool_worker_lifetime_early_close) ... ok
test_starmap (test_util_pool.TestProcessPool.test_starmap) ... ok
test_starmap_async (test_util_pool.TestProcessPool.test_starmap_async) ... ok
test_terminate (test_util_pool.TestProcessPool.test_terminate) ... ok
test_unsupported_args (test_util_pool.TestProcessPool.test_unsupported_args) ... ok
test_apply (test_util_pool.TestThreadPool.test_apply) ... ok
test_apply_async (test_util_pool.TestThreadPool.test_apply_async) ... ok
test_apply_async_timeout (test_util_pool.TestThreadPool.test_apply_async_timeout) ... ok
test_arg_initializer (test_util_pool.TestThreadPool.test_arg_initializer) ... ok
test_arg_processes (test_util_pool.TestThreadPool.test_arg_processes) ... ok
test_async_error_callback (test_util_pool.TestThreadPool.test_async_error_callback) ... ok
test_empty_iterable (test_util_pool.TestThreadPool.test_empty_iterable) ... ok
test_enter_exit (test_util_pool.TestThreadPool.test_enter_exit) ... ok
test_imap (test_util_pool.TestThreadPool.test_imap) ... ok
test_imap_unordered (test_util_pool.TestThreadPool.test_imap_unordered) ... ok
test_istarmap (test_util_pool.TestThreadPool.test_istarmap) ... ok
test_istarmap_unordered (test_util_pool.TestThreadPool.test_istarmap_unordered) ... ok
test_map (test_util_pool.TestThreadPool.test_map) ... ok
test_map_async (test_util_pool.TestThreadPool.test_map_async) ... ok
test_map_async_callbacks (test_util_pool.TestThreadPool.test_map_async_callbacks) ... ok
test_pool_worker_lifetime_early_close (test_util_pool.TestThreadPool.test_pool_worker_lifetime_early_close) ... ok
test_starmap (test_util_pool.TestThreadPool.test_starmap) ... ok
test_starmap_async (test_util_pool.TestThreadPool.test_starmap_async) ... ok
test_terminate (test_util_pool.TestThreadPool.test_terminate) ... ok
test_unsupported_args (test_util_pool.TestThreadPool.test_unsupported_args) ... ok
testAcquireFree (test_util_sync.TestConditionMutexSelf.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionMutexSelf.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionMutexSelf.testWaitForNotify) ... ok
testWaitNotify (test_util_sync.TestConditionMutexSelf.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionMutexSelf.testWaitNotifyAll) ... ok
testAcquireFree (test_util_sync.TestConditionMutexWorld.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionMutexWorld.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionMutexWorld.testWaitForNotify) ...
ok
testWaitNotify (test_util_sync.TestConditionMutexWorld.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionMutexWorld.testWaitNotifyAll) ... ok
testAcquireFree (test_util_sync.TestConditionSelf.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionSelf.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionSelf.testWaitForNotify) ... ok
testWaitNotify (test_util_sync.TestConditionSelf.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionSelf.testWaitNotifyAll) ... ok
testAcquireFree (test_util_sync.TestConditionWorld.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionWorld.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionWorld.testWaitForNotify) ... ok
testWaitNotify (test_util_sync.TestConditionWorld.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionWorld.testWaitNotifyAll) ... ok
testDefault (test_util_sync.TestCounterSelf.testDefault) ... ok
testFree (test_util_sync.TestCounterSelf.testFree) ... ok
testIter (test_util_sync.TestCounterSelf.testIter) ... ok
testNext (test_util_sync.TestCounterSelf.testNext) ... ok
testRoot (test_util_sync.TestCounterSelf.testRoot) ... ok
testStart (test_util_sync.TestCounterSelf.testStart) ... ok
testStep (test_util_sync.TestCounterSelf.testStep) ... ok
testTypechar (test_util_sync.TestCounterSelf.testTypechar) ... ok
testDefault (test_util_sync.TestCounterWorld.testDefault) ... ok
testFree (test_util_sync.TestCounterWorld.testFree) ... ok
testIter (test_util_sync.TestCounterWorld.testIter) ... ok
testNext (test_util_sync.TestCounterWorld.testNext) ... ok
testRoot (test_util_sync.TestCounterWorld.testRoot) ... ok
testStart (test_util_sync.TestCounterWorld.testStart) ... ok
testStep (test_util_sync.TestCounterWorld.testStep) ... ok
testTypechar (test_util_sync.TestCounterWorld.testTypechar) ... ok
testAcquireFree (test_util_sync.TestMutexBasicSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexBasicSelf.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexBasicSelf.testAcquireRelease) ... ok
testExclusion (test_util_sync.TestMutexBasicSelf.testExclusion) ... ok
testFairness (test_util_sync.TestMutexBasicSelf.testFairness) ... ok
testFree (test_util_sync.TestMutexBasicSelf.testFree) ... ok
testWith (test_util_sync.TestMutexBasicSelf.testWith) ... ok
testAcquireFree (test_util_sync.TestMutexBasicWorld.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexBasicWorld.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexBasicWorld.testAcquireRelease) ... ok
testExclusion (test_util_sync.TestMutexBasicWorld.testExclusion) ... ok
testFairness (test_util_sync.TestMutexBasicWorld.testFairness) ... ok
testFree (test_util_sync.TestMutexBasicWorld.testFree) ... ok
testWith (test_util_sync.TestMutexBasicWorld.testWith) ... ok
testAcquireFree (test_util_sync.TestMutexRecursiveSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexRecursiveSelf.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexRecursiveSelf.testAcquireRelease) ... ok
testFree (test_util_sync.TestMutexRecursiveSelf.testFree) ... ok
testWith (test_util_sync.TestMutexRecursiveSelf.testWith) ... ok
testAcquireFree (test_util_sync.TestMutexRecursiveWorld.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexRecursiveWorld.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexRecursiveWorld.testAcquireRelease) ... ok
testFree (test_util_sync.TestMutexRecursiveWorld.testFree) ... ok
testWith (test_util_sync.TestMutexRecursiveWorld.testWith) ... ok
testAcquireFree (test_util_sync.TestSemaphoreSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestSemaphoreSelf.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestSemaphoreSelf.testAcquireRelease) ... ok
testBounded (test_util_sync.TestSemaphoreSelf.testBounded) ... ok
testFree (test_util_sync.TestSemaphoreSelf.testFree) ... ok
testValue (test_util_sync.TestSemaphoreSelf.testValue) ... ok
testWith (test_util_sync.TestSemaphoreSelf.testWith) ... ok
testAcquireFree (test_util_sync.TestSemaphoreWorld.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestSemaphoreWorld.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestSemaphoreWorld.testAcquireRelease) ... ok
testBounded (test_util_sync.TestSemaphoreWorld.testBounded) ... ok
testFree (test_util_sync.TestSemaphoreWorld.testFree) ... ok
testValue (test_util_sync.TestSemaphoreWorld.testValue) ... ok
testWith (test_util_sync.TestSemaphoreWorld.testWith) ... ok
testBeginEnd (test_util_sync.TestSequentialSelf.testBeginEnd) ... ok
testWith (test_util_sync.TestSequentialSelf.testWith) ... ok
testBeginEnd (test_util_sync.TestSequentialWorld.testBeginEnd) ... ok
testWith (test_util_sync.TestSequentialWorld.testWith) ... ok
testAttributes (test_win.TestWinAllocateSelf.testAttributes) ... ok
testCreateFlavor (test_win.TestWinAllocateSelf.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinAllocateSelf.testGetAttr) ... ok
testGetGroup (test_win.TestWinAllocateSelf.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinAllocateSelf.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinAllocateSelf.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinAllocateSelf.testGetSetName) ... ok
testMemory (test_win.TestWinAllocateSelf.testMemory) ... ok
testMemoryModel (test_win.TestWinAllocateSelf.testMemoryModel) ... ok
testPickle (test_win.TestWinAllocateSelf.testPickle) ... ok
testPyProps (test_win.TestWinAllocateSelf.testPyProps) ... ok
testAttributes (test_win.TestWinAllocateSharedSelf.testAttributes) ... ok
testCreateFlavor (test_win.TestWinAllocateSharedSelf.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinAllocateSharedSelf.testGetAttr) ...
ok
testGetGroup (test_win.TestWinAllocateSharedSelf.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinAllocateSharedSelf.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinAllocateSharedSelf.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinAllocateSharedSelf.testGetSetName) ... ok
testMemory (test_win.TestWinAllocateSharedSelf.testMemory) ... ok
testMemoryModel (test_win.TestWinAllocateSharedSelf.testMemoryModel) ... ok
testPickle (test_win.TestWinAllocateSharedSelf.testPickle) ... ok
testPyProps (test_win.TestWinAllocateSharedSelf.testPyProps) ... ok
testSharedQuery (test_win.TestWinAllocateSharedSelf.testSharedQuery) ... ok
testAttributes (test_win.TestWinAllocateSharedWorld.testAttributes) ... ok
testCreateFlavor (test_win.TestWinAllocateSharedWorld.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinAllocateSharedWorld.testGetAttr) ... ok
testGetGroup (test_win.TestWinAllocateSharedWorld.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinAllocateSharedWorld.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinAllocateSharedWorld.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinAllocateSharedWorld.testGetSetName) ... ok
testMemory (test_win.TestWinAllocateSharedWorld.testMemory) ... ok
testMemoryModel (test_win.TestWinAllocateSharedWorld.testMemoryModel) ... ok
testPickle (test_win.TestWinAllocateSharedWorld.testPickle) ... ok
testPyProps (test_win.TestWinAllocateSharedWorld.testPyProps) ... ok
testSharedQuery (test_win.TestWinAllocateSharedWorld.testSharedQuery) ... ok
testAttributes (test_win.TestWinAllocateWorld.testAttributes) ... ok
testCreateFlavor (test_win.TestWinAllocateWorld.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinAllocateWorld.testGetAttr) ... ok
testGetGroup (test_win.TestWinAllocateWorld.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinAllocateWorld.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinAllocateWorld.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinAllocateWorld.testGetSetName) ... ok
testMemory (test_win.TestWinAllocateWorld.testMemory) ... ok
testMemoryModel (test_win.TestWinAllocateWorld.testMemoryModel) ... ok
testPickle (test_win.TestWinAllocateWorld.testPickle) ... ok
testPyProps (test_win.TestWinAllocateWorld.testPyProps) ... ok
testAttachDetach (test_win.TestWinCreateDynamicSelf.testAttachDetach) ... ok
testAttributes (test_win.TestWinCreateDynamicSelf.testAttributes) ... ok
testCreateFlavor (test_win.TestWinCreateDynamicSelf.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinCreateDynamicSelf.testGetAttr) ... ok
testGetGroup (test_win.TestWinCreateDynamicSelf.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinCreateDynamicSelf.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinCreateDynamicSelf.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinCreateDynamicSelf.testGetSetName) ... ok
testMemory (test_win.TestWinCreateDynamicSelf.testMemory) ... ok
testMemoryModel (test_win.TestWinCreateDynamicSelf.testMemoryModel) ... ok
testPickle (test_win.TestWinCreateDynamicSelf.testPickle) ... ok
testPyProps (test_win.TestWinCreateDynamicSelf.testPyProps) ... ok
testAttachDetach (test_win.TestWinCreateDynamicWorld.testAttachDetach) ... ok
testAttributes (test_win.TestWinCreateDynamicWorld.testAttributes) ... ok
testCreateFlavor (test_win.TestWinCreateDynamicWorld.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinCreateDynamicWorld.testGetAttr) ... ok
testGetGroup (test_win.TestWinCreateDynamicWorld.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinCreateDynamicWorld.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinCreateDynamicWorld.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinCreateDynamicWorld.testGetSetName) ... ok
testMemory (test_win.TestWinCreateDynamicWorld.testMemory) ... ok
testMemoryModel (test_win.TestWinCreateDynamicWorld.testMemoryModel) ... ok
testPickle (test_win.TestWinCreateDynamicWorld.testPickle) ... ok
testPyProps (test_win.TestWinCreateDynamicWorld.testPyProps) ... ok
testAttributes (test_win.TestWinCreateSelf.testAttributes) ... ok
testCreateFlavor (test_win.TestWinCreateSelf.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinCreateSelf.testGetAttr) ... ok
testGetGroup (test_win.TestWinCreateSelf.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinCreateSelf.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinCreateSelf.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinCreateSelf.testGetSetName) ... ok
testMemory (test_win.TestWinCreateSelf.testMemory) ... ok
testMemoryModel (test_win.TestWinCreateSelf.testMemoryModel) ... ok
testPickle (test_win.TestWinCreateSelf.testPickle) ... ok
testPyProps (test_win.TestWinCreateSelf.testPyProps) ... ok
testAttributes (test_win.TestWinCreateWorld.testAttributes) ... ok
testCreateFlavor (test_win.TestWinCreateWorld.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinCreateWorld.testGetAttr) ... ok
testGetGroup (test_win.TestWinCreateWorld.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinCreateWorld.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinCreateWorld.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinCreateWorld.testGetSetName) ... ok
testMemory (test_win.TestWinCreateWorld.testMemory) ... ok
testMemoryModel (test_win.TestWinCreateWorld.testMemoryModel) ... ok
testPickle (test_win.TestWinCreateWorld.testPickle) ... ok
testPyProps (test_win.TestWinCreateWorld.testPyProps) ... ok
testConstructor (test_win.TestWinNull.testConstructor) ... ok
testGetName (test_win.TestWinNull.testGetName) ...
ok ---------------------------------------------------------------------- Ran 2081 tests in 236.942s OK (skipped=182) I: running tests with MPI (4 processes) host: virt32a [mpiexec@virt32a] Timeout set to -1 (-1 means infinite) ================================================================================================== mpiexec options: ---------------- Base path: /usr/bin/ Launcher: (null) Debug level: 1 Enable X: -1 Global environment: ------------------- DEB_HOST_GNU_SYSTEM=linux-gnueabihf SUDO_GID=113 OMPI_MCA_plm=isolated DFLAGS=-frelease DEB_BUILD_ARCH_BITS=32 DEB_TARGET_GNU_CPU=arm MAIL=/var/mail/root DEB_HOST_ARCH_OS=linux LANGUAGE=en_US:en USER=pbuilder1 OMPI_MCA_rmaps_base_oversubscribe=true ASFLAGS_FOR_BUILD= CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf BUILDUSERNAME=pbuilder1 FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection DEB_TARGET_MULTIARCH=arm-linux-gnueabihf OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild DEB_BUILD_ARCH_CPU=arm SHLVL=2 DEB_HOST_ARCH_LIBC=gnu DEB_HOST_ARCH_ABI=eabihf OLDPWD=/ BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other HOME=/nonexistent/first-build DEB_BUILD_ARCH_ENDIAN=little DFLAGS_FOR_BUILD=-frelease LDFLAGS=-Wl,-z,relro DEB_TARGET_ARCH_BITS=32 DEB_BUILD_GNU_SYSTEM=linux-gnueabihf MAKEFLAGS=w PBUILDER_OPERATION=build CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security OMPI_MCA_btl_vader_single_copy_mechanism=none MPI4PY_TEST_SPAWN=false OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security DEB_BUILD_ARCH_OS=linux DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf SUDO_UID=107 PBUILDER_PKGDATADIR=/usr/share/pbuilder DEB_TARGET_ARCH_CPU=arm https_proxy=http://127.0.0.1:9/ LOGNAME=pbuilder1 http_proxy=http://127.0.0.1:9/ DEB_BUILD_ARCH_LIBC=gnu DEB_BUILD_ARCH_ABI=eabihf PBUILDER_SYSCONFDIR=/etc _=/usr/bin/unshare OMPI_MCA_btl_base_warn_component_unused=false DEB_HOST_ARCH=armhf LDFLAGS_FOR_BUILD=-Wl,-z,relro DEB_TARGET_ARCH_ENDIAN=little TERM=unknown DH_INTERNAL_OVERRIDE=dh_auto_test DEB_HOST_GNU_CPU=arm DEB_TARGET_GNU_SYSTEM=linux-gnueabihf PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games INVOCATION_ID=610244979ed544ba900474fb6a7bad85 DEB_TARGET_ARCH_OS=linux CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security MAKELEVEL=2 DEB_HOST_MULTIARCH=arm-linux-gnueabihf SOURCE_DATE_EPOCH=1727596276 FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security LANG=C LD_PRELOAD= DEB_TARGET_ARCH_LIBC=gnu DEB_TARGET_ARCH_ABI=eabihf PBUILDER_PKGLIBDIR=/usr/lib/pbuilder DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 DEBIAN_FRONTEND=noninteractive DH_INTERNAL_BUILDFLAGS=1 SHELL=/bin/bash GITHUB_ACTIONS=true DEB_HOST_ARCH_BITS=32 DEB_BUILD_ARCH=armhf SUDO_USER=jenkins CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security DEB_BUILD_GNU_CPU=arm ASFLAGS= DEB_HOST_GNU_TYPE=arm-linux-gnueabihf FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection LC_ALL=C OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security PWD=/build/reproducible-path/mpi4py-4.0.0 DEB_HOST_ARCH_CPU=arm DEB_RULES_REQUIRES_ROOT=no FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection DEB_BUILD_MULTIARCH=arm-linux-gnueabihf CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2 PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build MFLAGS=-w TZ=/usr/share/zoneinfo/Etc/GMT+12 PBCURRENTCOMMANDLINEOPERATION=build DEB_HOST_ARCH_ENDIAN=little DEB_TARGET_ARCH=armhf Hydra internal environment: --------------------------- GFORTRAN_UNBUFFERED_PRECONNECTED=y Proxy information: ********************* [1] proxy: virt32a (1 cores) Exec list: /usr/bin/python3.12 (4 processes); ================================================================================================== Proxy launch args: /usr/bin/hydra_pmi_proxy --control-port virt32a:39293 --debug --rmk user --launcher ssh --demux poll --pgid 0 --retries 10 --usize -2 --pmi-port 0 --gpus-per-proc -2 --gpu-subdevs-per-proc -2 --proxy-id Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 0 --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/runtests.py -fv [mpiexec@virt32a] Launch arguments: /usr/bin/hydra_pmi_proxy --control-port virt32a:39293 --debug --rmk user --launcher ssh --demux poll --pgid 0 --retries 10 --usize -2 --pmi-port 0 --gpus-per-proc -2 --gpu-subdevs-per-proc -2 --proxy-id 0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 
keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_0_289500754_virt32a [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_0_289500754_virt32a [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_0_289500754_virt32a [proxy:0@virt32a] got pmi command from downstream 
0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_0_289500754_virt32a [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=get kvsname=kvs_27847_0_289500754_virt32a 
key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=put kvsname=kvs_27847_0_289500754_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=keyval_cache -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 
0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70544A4978327600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=put kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard value=description#virt32a$port#45101$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#45101$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=put kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard value=description#virt32a$port#34485$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#34485$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=put kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard value=description#virt32a$port#35709$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#35709$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=put kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard value=description#virt32a$port#47245$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#47245$ifname#127.0.1.1$ 
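The `-bcast-1-0` value put and fetched in the exchange above is a hex-encoded, NUL-terminated byte string. A minimal sketch decoding the exact value from this log (the helper name `decode_pmi_hex` is mine, not part of MPICH):

```python
def decode_pmi_hex(value: str) -> str:
    """Decode a hex-encoded, NUL-terminated PMI key-value payload."""
    raw = bytes.fromhex(value)                  # hex digits -> raw bytes
    return raw.rstrip(b"\x00").decode("ascii")  # drop the trailing NUL

# The exact -bcast-1-0 value exchanged in the log above:
value = ("2F6465762F73686D2F6D706963685F"
         "736861725F746D70544A4978327600")
print(decode_pmi_hex(value))  # -> /dev/shm/mpich_shar_tmpTJIx2v
```

Decoded, the value is `/dev/shm/mpich_shar_tmpTJIx2v`: the path of the MPICH shared-memory segment that rank 0 broadcasts to the other local ranks via the PMI key-value space.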
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#45101$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34485$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35709$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47245$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#45101$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34485$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35709$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47245$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#45101$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34485$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35709$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47245$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#45101$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34485$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35709$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47245$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=barrier_out 
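Each rank publishes a "business card" (its listening endpoint) under the key `P<rank>-businesscard` before the barrier above. The `description#host$port#N$ifname#ip$` layout is inferred from the values in this log; the parser below is an illustrative sketch, not an MPICH API:

```python
def parse_business_card(card: str) -> dict:
    """Split a Hydra/MPICH business card such as
    'description#virt32a$port#45101$ifname#127.0.1.1$'
    into its key/value fields."""
    fields = {}
    for item in card.rstrip("$").split("$"):   # '$'-separated entries
        key, _, value = item.partition("#")    # each entry is key#value
        fields[key] = value
    return fields

card = parse_business_card("description#virt32a$port#45101$ifname#127.0.1.1$")
print(card)  # {'description': 'virt32a', 'port': '45101', 'ifname': '127.0.1.1'}
```

The `mput`/`keyval_cache` round-trip above then distributes all four cards, which is how each rank learns the TCP endpoints of its peers.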
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [0@virt32a] Python 3.12.6 (/usr/bin/python3.12) [0@virt32a] numpy 1.26.4 (/usr/lib/python3/dist-packages/numpy) [0@virt32a] MPI 4.1 (MPICH 4.2.0) [0@virt32a] mpi4py 4.0.0 (/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py) testAintAdd (test_address.TestAddress.testAintAdd) ... ok testAintDiff (test_address.TestAddress.testAintDiff) ... ok testBottom (test_address.TestAddress.testBottom) ... ok testGetAddress1 (test_address.TestAddress.testGetAddress1) ... ok testGetAddress2 (test_address.TestAddress.testGetAddress2) ... ok testNone (test_address.TestAddress.testNone) ... ok testAttr (test_attributes.TestCommAttrSelf.testAttr) ... ok testAttrCopyDelete (test_attributes.TestCommAttrSelf.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrSelf.testAttrCopyException) ... 
skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrSelf.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrSelf.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestCommAttrSelf.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrSelf.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrSelf.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrSelf.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrSelf.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestCommAttrWorld.testAttr) ... [1@virt32a] Python 3.12.6 (/usr/bin/python3.12) [1@virt32a] numpy 1.26.4 (/usr/lib/python3/dist-packages/numpy) [1@virt32a] MPI 4.1 (MPICH 4.2.0) [1@virt32a] mpi4py 4.0.0 (/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py) testAintAdd (test_address.TestAddress.testAintAdd) ... ok testAintDiff (test_address.TestAddress.testAintDiff) ... ok testBottom (test_address.TestAddress.testBottom) ... ok testGetAddress1 (test_address.TestAddress.testGetAddress1) ... ok testGetAddress2 (test_address.TestAddress.testGetAddress2) ... ok testNone (test_address.TestAddress.testNone) ... ok testAttr (test_attributes.TestCommAttrSelf.testAttr) ... ok testAttrCopyDelete (test_attributes.TestCommAttrSelf.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrSelf.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrSelf.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrSelf.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestCommAttrSelf.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrSelf.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrSelf.testAttrNoPython) ... 
ok testAttrNoPythonArray (test_attributes.TestCommAttrSelf.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrSelf.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestCommAttrWorld.testAttr) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [2@virt32a] Python 3.12.6 (/usr/bin/python3.12) [2@virt32a] numpy 1.26.4 (/usr/lib/python3/dist-packages/numpy) [2@virt32a] MPI 4.1 (MPICH 4.2.0) [2@virt32a] mpi4py 4.0.0 (/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py) testAintAdd (test_address.TestAddress.testAintAdd) ... ok testAintDiff (test_address.TestAddress.testAintDiff) ... ok testBottom (test_address.TestAddress.testBottom) ... ok testGetAddress1 (test_address.TestAddress.testGetAddress1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetAddress2 (test_address.TestAddress.testGetAddress2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNone (test_address.TestAddress.testNone) ... ok testAttr (test_attributes.TestCommAttrSelf.testAttr) ... ok testAttrCopyDelete (test_attributes.TestCommAttrSelf.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrSelf.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrSelf.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrSelf.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestCommAttrSelf.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrSelf.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrSelf.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrSelf.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrSelf.testAttrNoPythonZero) ... 
ok testAttr (test_attributes.TestCommAttrWorld.testAttr) ... [3@virt32a] Python 3.12.6 (/usr/bin/python3.12) [3@virt32a] numpy 1.26.4 (/usr/lib/python3/dist-packages/numpy) [3@virt32a] MPI 4.1 (MPICH 4.2.0) [3@virt32a] mpi4py 4.0.0 (/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py) testAintAdd (test_address.TestAddress.testAintAdd) ... ok testAintDiff (test_address.TestAddress.testAintDiff) ... ok testBottom (test_address.TestAddress.testBottom) ... ok testGetAddress1 (test_address.TestAddress.testGetAddress1) ... ok testGetAddress2 (test_address.TestAddress.testGetAddress2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNone (test_address.TestAddress.testNone) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttr (test_attributes.TestCommAttrSelf.testAttr) ... ok testAttrCopyDelete (test_attributes.TestCommAttrSelf.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrSelf.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrSelf.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrSelf.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestCommAttrSelf.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrSelf.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrSelf.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrSelf.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrSelf.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestCommAttrWorld.testAttr) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyDelete (test_attributes.TestCommAttrWorld.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrWorld.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrWorld.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrWorld.testAttrCopyTrue) ... ok testAttrCopyDelete (test_attributes.TestCommAttrWorld.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrWorld.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrWorld.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrWorld.testAttrCopyTrue) ... ok testAttrCopyDelete (test_attributes.TestCommAttrWorld.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrWorld.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrWorld.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrWorld.testAttrCopyTrue) ... ok testAttrCopyDelete (test_attributes.TestCommAttrWorld.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestCommAttrWorld.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestCommAttrWorld.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestCommAttrWorld.testAttrCopyTrue) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrDeleteException (test_attributes.TestCommAttrWorld.testAttrDeleteException) ... 
skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrWorld.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrWorld.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrWorld.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrWorld.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrBYTE.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrBYTE.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrBYTE.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrBYTE.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrBYTE.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrBYTE.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrBYTE.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrBYTE.testAttrNoPython) ... ok ok testAttrDeleteException (test_attributes.TestCommAttrWorld.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrWorld.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrWorld.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrWorld.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrWorld.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrBYTE.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrBYTE.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrBYTE.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrBYTE.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrBYTE.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrBYTE.testAttrDeleteException) ... 
skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrBYTE.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrBYTE.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonArray) ... ok ok testAttrDeleteException (test_attributes.TestCommAttrWorld.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrWorld.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrWorld.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrWorld.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestCommAttrWorld.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrBYTE.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrBYTE.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrBYTE.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrBYTE.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrBYTE.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrBYTE.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrBYTE.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrBYTE.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonZero) ... ok testAttrDeleteException (test_attributes.TestCommAttrWorld.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestCommAttrWorld.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestCommAttrWorld.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestCommAttrWorld.testAttrNoPythonArray) ... 
ok testAttrNoPythonZero (test_attributes.TestCommAttrWorld.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrBYTE.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrBYTE.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrBYTE.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrBYTE.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrBYTE.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrBYTE.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrBYTE.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrBYTE.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonArray) ... testAttrNoPythonArray (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonArray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttr (test_attributes.TestDatatypeAttrFLOAT.testAttr) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrFLOAT.testAttr) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyDelete (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyDelete) ... testAttrCopyDelete (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyDelete) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testAttrCopyException (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyException) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyException (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyFalse) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR skipped 'mpich(<4.2.1)' [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoPythonZero (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonZero) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrCopyFalse (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyTrue) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrFLOAT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoCopy (test_attributes.TestDatatypeAttrFLOAT.testAttrNoCopy) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrDeleteException (test_attributes.TestDatatypeAttrFLOAT.testAttrDeleteException) ... 
skipped 'mpich(<4.2.1)' ok testAttrNoPython (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPython) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoCopy (test_attributes.TestDatatypeAttrFLOAT.testAttrNoCopy) ... ok ok testAttrNoPython (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPython) ... testAttrNoPythonArray (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonArray) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonArray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoPythonZero (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonZero) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonZero) ... testAttr (test_attributes.TestDatatypeAttrINT.testAttr) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttr (test_attributes.TestDatatypeAttrFLOAT.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrINT.testAttrCopyDelete) ... 
ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttr (test_attributes.TestDatatypeAttrINT.testAttr) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrCopyException (test_attributes.TestDatatypeAttrINT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrINT.testAttrCopyFalse) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyDelete (test_attributes.TestDatatypeAttrINT.testAttrCopyDelete) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrCopyTrue (test_attributes.TestDatatypeAttrINT.testAttrCopyTrue) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyException (test_attributes.TestDatatypeAttrINT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrINT.testAttrCopyFalse) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrINT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrINT.testAttrNoCopy) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyDelete (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyDelete) ... 
ok testAttrCopyTrue (test_attributes.TestDatatypeAttrINT.testAttrCopyTrue) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoPython (test_attributes.TestDatatypeAttrINT.testAttrNoPython) ... ok ok testAttrDeleteException (test_attributes.TestDatatypeAttrINT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoPythonArray (test_attributes.TestDatatypeAttrINT.testAttrNoPythonArray) ... testAttrNoCopy (test_attributes.TestDatatypeAttrINT.testAttrNoCopy) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoPythonZero (test_attributes.TestDatatypeAttrINT.testAttrNoPythonZero) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrINT.testAttrNoPython) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrINT.testAttrNoPythonArray) ... testAttr (test_attributes.TestWinAttr.testAttr) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrCopyException (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyException) ... 
skipped 'mpich(<4.2.1)' [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrINT.testAttrNoPythonZero) ... testAttrCopyDelete (test_attributes.TestWinAttr.testAttrCopyDelete) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrCopyFalse (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyFalse) ... ok testAttr (test_attributes.TestWinAttr.testAttr) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyException (test_attributes.TestWinAttr.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestWinAttr.testAttrCopyFalse) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyTrue (test_attributes.TestWinAttr.testAttrCopyTrue) ... testAttrCopyDelete (test_attributes.TestWinAttr.testAttrCopyDelete) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrDeleteException (test_attributes.TestWinAttr.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoCopy (test_attributes.TestWinAttr.testAttrNoCopy) ... 
ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyTrue (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyTrue) ... testAttrCopyException (test_attributes.TestWinAttr.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestWinAttr.testAttrCopyFalse) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrNoPython (test_attributes.TestWinAttr.testAttrNoPython) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrCopyTrue (test_attributes.TestWinAttr.testAttrCopyTrue) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrNoPythonArray (test_attributes.TestWinAttr.testAttrNoPythonArray) ... ok ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrDeleteException (test_attributes.TestWinAttr.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestWinAttr.testAttrNoCopy) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoPythonZero (test_attributes.TestWinAttr.testAttrNoPythonZero) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrDeleteException (test_attributes.TestDatatypeAttrFLOAT.testAttrDeleteException) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttrNoPython (test_attributes.TestWinAttr.testAttrNoPython) ... 
testAllocate (test_buffer.TestBuffer.testAllocate) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR skipped 'mpich(<4.2.1)' ok testAttrNoPythonArray (test_attributes.TestWinAttr.testAttrNoPythonArray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttachBufferReadonly (test_buffer.TestBuffer.testAttachBufferReadonly) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttrNoCopy (test_attributes.TestDatatypeAttrFLOAT.testAttrNoCopy) ... ok testAttrNoPythonZero (test_attributes.TestWinAttr.testAttrNoPythonZero) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBuffering (test_buffer.TestBuffer.testBuffering) ... ok ok testCast (test_buffer.TestBuffer.testCast) ... testAllocate (test_buffer.TestBuffer.testAllocate) ... ok testFromAddress (test_buffer.TestBuffer.testFromAddress) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPython) ... ok testAttachBufferReadonly (test_buffer.TestBuffer.testAttachBufferReadonly) ... ok ok testFromBufferArrayRO (test_buffer.TestBuffer.testFromBufferArrayRO) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testBuffering (test_buffer.TestBuffer.testBuffering) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFromBufferArrayRW (test_buffer.TestBuffer.testFromBufferArrayRW) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCast (test_buffer.TestBuffer.testCast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFromBufferBad (test_buffer.TestBuffer.testFromBufferBad) ... ok testFromAddress (test_buffer.TestBuffer.testFromAddress) ... ok ok ok testFromBufferBytes (test_buffer.TestBuffer.testFromBufferBytes) ... 
ok testFromBufferArrayRO (test_buffer.TestBuffer.testFromBufferArrayRO) ... testAttrNoPythonArray (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonArray) ... ok testNewArray (test_buffer.TestBuffer.testNewArray) ... ok testNewBad (test_buffer.TestBuffer.testNewBad) ... testFromBufferArrayRW (test_buffer.TestBuffer.testFromBufferArrayRW) ... ok testFromBufferBad (test_buffer.TestBuffer.testFromBufferBad) ... ok ok testNewBytearray (test_buffer.TestBuffer.testNewBytearray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonZero) ... testNewBytes (test_buffer.TestBuffer.testNewBytes) ... ok ok testNewEmpty (test_buffer.TestBuffer.testNewEmpty) ... testFromBufferBytes (test_buffer.TestBuffer.testFromBufferBytes) ... ok ok testSequence (test_buffer.TestBuffer.testSequence) ... testNewArray (test_buffer.TestBuffer.testNewArray) ... ok testNewBad (test_buffer.TestBuffer.testNewBad) ... ok ok testToReadonly (test_buffer.TestBuffer.testToReadonly) ... ok testNewBytearray (test_buffer.TestBuffer.testNewBytearray) ... ok testAllgather (test_cco_buf.TestCCOBufInplaceSelf.testAllgather) ... ok testNewBytes (test_buffer.TestBuffer.testNewBytes) ... ok testAttr (test_attributes.TestDatatypeAttrINT.testAttr) ... testNewEmpty (test_buffer.TestBuffer.testNewEmpty) ... ok testSequence (test_buffer.TestBuffer.testSequence) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testToReadonly (test_buffer.TestBuffer.testToReadonly) ... ok ok testAllgather (test_cco_buf.TestCCOBufInplaceSelf.testAllgather) ... testAttrCopyDelete (test_attributes.TestDatatypeAttrINT.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrINT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrINT.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrINT.testAttrCopyTrue) ... 
ok testAttrDeleteException (test_attributes.TestDatatypeAttrINT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrINT.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrINT.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrINT.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrINT.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestWinAttr.testAttr) ... ok testAttrCopyDelete (test_attributes.TestWinAttr.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestWinAttr.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestWinAttr.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestWinAttr.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestWinAttr.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestWinAttr.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestWinAttr.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestWinAttr.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestWinAttr.testAttrNoPythonZero) ... ok testAllocate (test_buffer.TestBuffer.testAllocate) ... ok testAttachBufferReadonly (test_buffer.TestBuffer.testAttachBufferReadonly) ... ok testBuffering (test_buffer.TestBuffer.testBuffering) ... ok testCast (test_buffer.TestBuffer.testCast) ... ok testFromAddress (test_buffer.TestBuffer.testFromAddress) ... ok testFromBufferArrayRO (test_buffer.TestBuffer.testFromBufferArrayRO) ... ok testFromBufferArrayRW (test_buffer.TestBuffer.testFromBufferArrayRW) ... ok testFromBufferBad (test_buffer.TestBuffer.testFromBufferBad) ... testAttrNoPythonZero (test_attributes.TestDatatypeAttrBYTE.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrFLOAT.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyDelete) ... 
ok testAttrCopyException (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrFLOAT.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrFLOAT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrFLOAT.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrFLOAT.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestDatatypeAttrINT.testAttr) ... ok testAttrCopyDelete (test_attributes.TestDatatypeAttrINT.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestDatatypeAttrINT.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestDatatypeAttrINT.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestDatatypeAttrINT.testAttrCopyTrue) ... ok testAttrDeleteException (test_attributes.TestDatatypeAttrINT.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestDatatypeAttrINT.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestDatatypeAttrINT.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestDatatypeAttrINT.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestDatatypeAttrINT.testAttrNoPythonZero) ... ok testAttr (test_attributes.TestWinAttr.testAttr) ... ok testAttrCopyDelete (test_attributes.TestWinAttr.testAttrCopyDelete) ... ok testAttrCopyException (test_attributes.TestWinAttr.testAttrCopyException) ... skipped 'mpich(<4.2.1)' testAttrCopyFalse (test_attributes.TestWinAttr.testAttrCopyFalse) ... ok testAttrCopyTrue (test_attributes.TestWinAttr.testAttrCopyTrue) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFromBufferBytes (test_buffer.TestBuffer.testFromBufferBytes) ... ok testNewArray (test_buffer.TestBuffer.testNewArray) ... ok testNewBad (test_buffer.TestBuffer.testNewBad) ... ok testNewBytearray (test_buffer.TestBuffer.testNewBytearray) ... ok testNewBytes (test_buffer.TestBuffer.testNewBytes) ... ok testNewEmpty (test_buffer.TestBuffer.testNewEmpty) ... ok testSequence (test_buffer.TestBuffer.testSequence) ... ok testToReadonly (test_buffer.TestBuffer.testToReadonly) ... ok testAllgather (test_cco_buf.TestCCOBufInplaceSelf.testAllgather) ... ok testAttrDeleteException (test_attributes.TestWinAttr.testAttrDeleteException) ... skipped 'mpich(<4.2.1)' testAttrNoCopy (test_attributes.TestWinAttr.testAttrNoCopy) ... ok testAttrNoPython (test_attributes.TestWinAttr.testAttrNoPython) ... ok testAttrNoPythonArray (test_attributes.TestWinAttr.testAttrNoPythonArray) ... ok testAttrNoPythonZero (test_attributes.TestWinAttr.testAttrNoPythonZero) ... ok testAllocate (test_buffer.TestBuffer.testAllocate) ... ok testAttachBufferReadonly (test_buffer.TestBuffer.testAttachBufferReadonly) ... ok testBuffering (test_buffer.TestBuffer.testBuffering) ... ok testCast (test_buffer.TestBuffer.testCast) ... ok testFromAddress (test_buffer.TestBuffer.testFromAddress) ... ok testFromBufferArrayRO (test_buffer.TestBuffer.testFromBufferArrayRO) ... ok testFromBufferArrayRW (test_buffer.TestBuffer.testFromBufferArrayRW) ... ok testFromBufferBad (test_buffer.TestBuffer.testFromBufferBad) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFromBufferBytes (test_buffer.TestBuffer.testFromBufferBytes) ... ok testNewArray (test_buffer.TestBuffer.testNewArray) ... ok testNewBad (test_buffer.TestBuffer.testNewBad) ... ok testNewBytearray (test_buffer.TestBuffer.testNewBytearray) ... ok testNewBytes (test_buffer.TestBuffer.testNewBytes) ... 
testNewEmpty (test_buffer.TestBuffer.testNewEmpty) ... ok
testSequence (test_buffer.TestBuffer.testSequence) ... ok
testToReadonly (test_buffer.TestBuffer.testToReadonly) ... ok
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testAllgather (test_cco_buf.TestCCOBufInplaceSelf.testAllgather) ... ok
testAllreduce (test_cco_buf.TestCCOBufInplaceSelf.testAllreduce) ... ok
testExscan (test_cco_buf.TestCCOBufInplaceSelf.testExscan) ... ok
testGather (test_cco_buf.TestCCOBufInplaceSelf.testGather) ... ok
testReduce (test_cco_buf.TestCCOBufInplaceSelf.testReduce) ... ok
testReduceScatter (test_cco_buf.TestCCOBufInplaceSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... ok
testScan (test_cco_buf.TestCCOBufInplaceSelf.testScan) ... ok
testScatter (test_cco_buf.TestCCOBufInplaceSelf.testScatter) ... ok
testAllgather (test_cco_buf.TestCCOBufInplaceWorld.testAllgather) ... ok
testAllreduce (test_cco_buf.TestCCOBufInplaceWorld.testAllreduce) ... ok
testExscan (test_cco_buf.TestCCOBufInplaceWorld.testExscan) ... ok
testGather (test_cco_buf.TestCCOBufInplaceWorld.testGather) ... ok
testReduce (test_cco_buf.TestCCOBufInplaceWorld.testReduce) ... ok
testReduceScatter (test_cco_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok
testScan (test_cco_buf.TestCCOBufInplaceWorld.testScan) ... ok
testScatter (test_cco_buf.TestCCOBufInplaceWorld.testScatter) ... ok
testAllgather (test_cco_buf.TestCCOBufSelf.testAllgather) ... ok
testAllreduce (test_cco_buf.TestCCOBufSelf.testAllreduce) ... ok
testAlltoall (test_cco_buf.TestCCOBufSelf.testAlltoall) ... ok
testBarrier (test_cco_buf.TestCCOBufSelf.testBarrier) ... ok
testBcast (test_cco_buf.TestCCOBufSelf.testBcast) ... ok
testBcastTypeIndexed (test_cco_buf.TestCCOBufSelf.testBcastTypeIndexed) ... ok
testExscan (test_cco_buf.TestCCOBufSelf.testExscan) ... ok
testGather (test_cco_buf.TestCCOBufSelf.testGather) ... ok
testReduce (test_cco_buf.TestCCOBufSelf.testReduce) ... ok
testReduceScatter (test_cco_buf.TestCCOBufSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_buf.TestCCOBufSelf.testReduceScatterBlock) ... ok
testScan (test_cco_buf.TestCCOBufSelf.testScan) ... ok
testScatter (test_cco_buf.TestCCOBufSelf.testScatter) ... ok
testAllgather (test_cco_buf.TestCCOBufSelfDup.testAllgather) ... ok
testAllreduce (test_cco_buf.TestCCOBufSelfDup.testAllreduce) ... ok
testAlltoall (test_cco_buf.TestCCOBufSelfDup.testAlltoall) ... ok
testBarrier (test_cco_buf.TestCCOBufSelfDup.testBarrier) ... ok
testBcast (test_cco_buf.TestCCOBufSelfDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_buf.TestCCOBufSelfDup.testExscan) ... ok
testGather (test_cco_buf.TestCCOBufSelfDup.testGather) ... ok
testReduce (test_cco_buf.TestCCOBufSelfDup.testReduce) ... ok
testReduceScatter (test_cco_buf.TestCCOBufSelfDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... ok
testScan (test_cco_buf.TestCCOBufSelfDup.testScan) ... ok
testScatter (test_cco_buf.TestCCOBufSelfDup.testScatter) ... ok
testAllgather (test_cco_buf.TestCCOBufWorld.testAllgather) ... ok
testAllreduce (test_cco_buf.TestCCOBufWorld.testAllreduce) ... ok
testAlltoall (test_cco_buf.TestCCOBufWorld.testAlltoall) ... ok
testBarrier (test_cco_buf.TestCCOBufWorld.testBarrier) ... ok
testBcast (test_cco_buf.TestCCOBufWorld.testBcast) ... ok
testBcastTypeIndexed (test_cco_buf.TestCCOBufWorld.testBcastTypeIndexed) ... ok
testExscan (test_cco_buf.TestCCOBufWorld.testExscan) ... ok
testGather (test_cco_buf.TestCCOBufWorld.testGather) ... ok
testReduce (test_cco_buf.TestCCOBufWorld.testReduce) ... ok
testReduceScatter (test_cco_buf.TestCCOBufWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok
testScan (test_cco_buf.TestCCOBufWorld.testScan) ... ok
testScatter (test_cco_buf.TestCCOBufWorld.testScatter) ... ok
testAllgather (test_cco_buf.TestCCOBufWorldDup.testAllgather) ... ok
testAllreduce (test_cco_buf.TestCCOBufWorldDup.testAllreduce) ... ok
testAlltoall (test_cco_buf.TestCCOBufWorldDup.testAlltoall) ... ok
testBarrier (test_cco_buf.TestCCOBufWorldDup.testBarrier) ... ok
testBcast (test_cco_buf.TestCCOBufWorldDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_buf.TestCCOBufWorldDup.testExscan) ... ok
testGather (test_cco_buf.TestCCOBufWorldDup.testGather) ... ok
testReduce (test_cco_buf.TestCCOBufWorldDup.testReduce) ... ok
testReduceScatter (test_cco_buf.TestCCOBufWorldDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok
testScan (test_cco_buf.TestCCOBufWorldDup.testScan) ... ok
testScatter (test_cco_buf.TestCCOBufWorldDup.testScatter) ... ok
testReduceLocal (test_cco_buf.TestReduceLocal.testReduceLocal) ... ok
testReduceLocalBadCount (test_cco_buf.TestReduceLocal.testReduceLocalBadCount) ... ok
testAllgather (test_cco_buf_inter.TestCCOBufInter.testAllgather) ... ok
testAllreduce (test_cco_buf_inter.TestCCOBufInter.testAllreduce) ... ok
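The TestReduceLocal cases above cover MPI's Reduce_local, which folds a send buffer into a receive buffer on the calling process only, with no communication. A rough plain-Python sketch of that semantic (hypothetical helper, not mpi4py's implementation):

```python
# Hypothetical sketch of MPI Reduce_local semantics: combine `inbuf` into
# `inoutbuf` element-wise on the local process alone -- no messages are sent.
def reduce_local(inbuf, inoutbuf, op=lambda a, b: a + b):
    for i, x in enumerate(inbuf):
        inoutbuf[i] = op(x, inoutbuf[i])
    return inoutbuf

acc = [10, 20, 30]
reduce_local([1, 2, 3], acc)   # acc becomes [11, 22, 33]
```

The "bad count" test corresponds to the MPI requirement that both buffers describe the same number of elements; this sketch omits that validation.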
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_buf_inter.TestCCOBufInter.testAlltoall) ... ok testAlltoall (test_cco_buf_inter.TestCCOBufInter.testAlltoall) ... ok testAlltoall (test_cco_buf_inter.TestCCOBufInter.testAlltoall) ... ok testAlltoall (test_cco_buf_inter.TestCCOBufInter.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_buf_inter.TestCCOBufInter.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInter.testBcast) ... ok testBarrier (test_cco_buf_inter.TestCCOBufInter.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInter.testBcast) ... ok testBarrier (test_cco_buf_inter.TestCCOBufInter.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInter.testBcast) ... ok testBarrier (test_cco_buf_inter.TestCCOBufInter.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInter.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_buf_inter.TestCCOBufInter.testGather) ... ok testGather (test_cco_buf_inter.TestCCOBufInter.testGather) ... ok testGather (test_cco_buf_inter.TestCCOBufInter.testGather) ... ok testGather (test_cco_buf_inter.TestCCOBufInter.testGather) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_buf_inter.TestCCOBufInter.testReduce) ... ok testReduce (test_cco_buf_inter.TestCCOBufInter.testReduce) ... ok testReduce (test_cco_buf_inter.TestCCOBufInter.testReduce) ... ok testReduce (test_cco_buf_inter.TestCCOBufInter.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_buf_inter.TestCCOBufInter.testScatter) ... ok testScatter (test_cco_buf_inter.TestCCOBufInter.testScatter) ... ok testScatter (test_cco_buf_inter.TestCCOBufInter.testScatter) ... ok testScatter (test_cco_buf_inter.TestCCOBufInter.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_buf_inter.TestCCOBufInterDup.testAllgather) ... ok testAllgather (test_cco_buf_inter.TestCCOBufInterDup.testAllgather) ... ok testAllgather (test_cco_buf_inter.TestCCOBufInterDup.testAllgather) ... ok testAllgather (test_cco_buf_inter.TestCCOBufInterDup.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_buf_inter.TestCCOBufInterDup.testAllreduce) ... ok testAllreduce (test_cco_buf_inter.TestCCOBufInterDup.testAllreduce) ... ok testAllreduce (test_cco_buf_inter.TestCCOBufInterDup.testAllreduce) ... 
ok testAllreduce (test_cco_buf_inter.TestCCOBufInterDup.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_buf_inter.TestCCOBufInterDup.testAlltoall) ... ok testAlltoall (test_cco_buf_inter.TestCCOBufInterDup.testAlltoall) ... ok testAlltoall (test_cco_buf_inter.TestCCOBufInterDup.testAlltoall) ... ok testAlltoall (test_cco_buf_inter.TestCCOBufInterDup.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_buf_inter.TestCCOBufInterDup.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInterDup.testBcast) ... ok testBarrier (test_cco_buf_inter.TestCCOBufInterDup.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInterDup.testBcast) ... ok testBarrier (test_cco_buf_inter.TestCCOBufInterDup.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInterDup.testBcast) ... ok testBarrier (test_cco_buf_inter.TestCCOBufInterDup.testBarrier) ... ok testBcast (test_cco_buf_inter.TestCCOBufInterDup.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_buf_inter.TestCCOBufInterDup.testGather) ... ok testGather (test_cco_buf_inter.TestCCOBufInterDup.testGather) ... ok testGather (test_cco_buf_inter.TestCCOBufInterDup.testGather) ... ok testGather (test_cco_buf_inter.TestCCOBufInterDup.testGather) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_buf_inter.TestCCOBufInterDup.testReduce) ... ok testReduce (test_cco_buf_inter.TestCCOBufInterDup.testReduce) ... ok testReduce (test_cco_buf_inter.TestCCOBufInterDup.testReduce) ... ok testReduce (test_cco_buf_inter.TestCCOBufInterDup.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_buf_inter.TestCCOBufInterDup.testScatter) ... ok testScatter (test_cco_buf_inter.TestCCOBufInterDup.testScatter) ... ok testScatter (test_cco_buf_inter.TestCCOBufInterDup.testScatter) ... ok testScatter (test_cco_buf_inter.TestCCOBufInterDup.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testAllgather (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllgather) ... ok testAllgather (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllgather) ... ok testAllgather (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllgather) ... testAllgather (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllreduce) ... 
ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllreduce) ... ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufInplaceSelf.testExscan) ... ok testExscan (test_cco_nb_buf.TestCCOBufInplaceSelf.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufInplaceSelf.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufInplaceSelf.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufInplaceSelf.testGather) ... ok testGather (test_cco_nb_buf.TestCCOBufInplaceSelf.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufInplaceSelf.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufInplaceSelf.testGather) ... ok testReduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatter) ... ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatter) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufInplaceSelf.testScan) ... ok testScan (test_cco_nb_buf.TestCCOBufInplaceSelf.testScan) ... ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufInplaceSelf.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufInplaceSelf.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testScatter) ... ok testScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufInplaceSelf.testScatter) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllgather) ... ok testAllgather (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllgather) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAllgather (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllreduce) ... ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllreduce) ... ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllreduce) ... ok testAllreduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufInplaceWorld.testExscan) ... ok testExscan (test_cco_nb_buf.TestCCOBufInplaceWorld.testExscan) ... ok testExscan (test_cco_nb_buf.TestCCOBufInplaceWorld.testExscan) ... ok testExscan (test_cco_nb_buf.TestCCOBufInplaceWorld.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufInplaceWorld.testGather) ... ok testGather (test_cco_nb_buf.TestCCOBufInplaceWorld.testGather) ... 
ok testGather (test_cco_nb_buf.TestCCOBufInplaceWorld.testGather) ... ok testGather (test_cco_nb_buf.TestCCOBufInplaceWorld.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduce) ... ok testReduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduce) ... ok testReduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduce) ... ok testReduce (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok testReduceScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufInplaceWorld.testScan) ... ok testScan (test_cco_nb_buf.TestCCOBufInplaceWorld.testScan) ... ok testScan (test_cco_nb_buf.TestCCOBufInplaceWorld.testScan) ... ok testScan (test_cco_nb_buf.TestCCOBufInplaceWorld.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testScatter) ... ok testScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testScatter) ... ok testScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testScatter) ... ok testScatter (test_cco_nb_buf.TestCCOBufInplaceWorld.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufSelf.testAllgather) ... ok testAllgather (test_cco_nb_buf.TestCCOBufSelf.testAllgather) ... ok testAllgather (test_cco_nb_buf.TestCCOBufSelf.testAllgather) ... ok testAllgather (test_cco_nb_buf.TestCCOBufSelf.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufSelf.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufSelf.testAllreduce) ... ok testAllreduce (test_cco_nb_buf.TestCCOBufSelf.testAllreduce) ... ok testAllreduce (test_cco_nb_buf.TestCCOBufSelf.testAllreduce) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_nb_buf.TestCCOBufSelf.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_nb_buf.TestCCOBufSelf.testAlltoall) ... ok testAlltoall (test_cco_nb_buf.TestCCOBufSelf.testAlltoall) ... ok testAlltoall (test_cco_nb_buf.TestCCOBufSelf.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_nb_buf.TestCCOBufSelf.testBarrier) ... ok testBcast (test_cco_nb_buf.TestCCOBufSelf.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_nb_buf.TestCCOBufSelf.testBarrier) ... ok testBcast (test_cco_nb_buf.TestCCOBufSelf.testBcast) ... ok testBarrier (test_cco_nb_buf.TestCCOBufSelf.testBarrier) ... ok testBcast (test_cco_nb_buf.TestCCOBufSelf.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelf.testBcastTypeIndexed) ... ok testBarrier (test_cco_nb_buf.TestCCOBufSelf.testBarrier) ... ok testBcast (test_cco_nb_buf.TestCCOBufSelf.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelf.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelf.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelf.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufSelf.testExscan) ... ok testExscan (test_cco_nb_buf.TestCCOBufSelf.testExscan) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufSelf.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufSelf.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufSelf.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufSelf.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufSelf.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelf.testReduce) ... ok testGather (test_cco_nb_buf.TestCCOBufSelf.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelf.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelf.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelf.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelf.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelf.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelf.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelf.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelf.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelf.testReduceScatterBlock) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelf.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelf.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufSelf.testScan) ... ok testScan (test_cco_nb_buf.TestCCOBufSelf.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufSelf.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufSelf.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufSelf.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufSelf.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufSelfDup.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufSelfDup.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufSelf.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufSelfDup.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufSelfDup.testAllreduce) ... ok testScatter (test_cco_nb_buf.TestCCOBufSelf.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufSelfDup.testAllgather) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufSelfDup.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufSelfDup.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_nb_buf.TestCCOBufSelfDup.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_nb_buf.TestCCOBufSelfDup.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_nb_buf.TestCCOBufSelfDup.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_nb_buf.TestCCOBufSelfDup.testBarrier) ... ok testBcast (test_cco_nb_buf.TestCCOBufSelfDup.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_nb_buf.TestCCOBufSelfDup.testBarrier) ... ok testBcast (test_cco_nb_buf.TestCCOBufSelfDup.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufSelfDup.testExscan) ... ok testAlltoall (test_cco_nb_buf.TestCCOBufSelfDup.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_nb_buf.TestCCOBufSelfDup.testExscan) ... ok testAlltoall (test_cco_nb_buf.TestCCOBufSelfDup.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_nb_buf.TestCCOBufSelfDup.testBarrier) ... 
ok testBcast (test_cco_nb_buf.TestCCOBufSelfDup.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_nb_buf.TestCCOBufSelfDup.testBarrier) ... ok testBcast (test_cco_nb_buf.TestCCOBufSelfDup.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufSelfDup.testGather) ... ok testExscan (test_cco_nb_buf.TestCCOBufSelfDup.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufSelfDup.testGather) ... ok testExscan (test_cco_nb_buf.TestCCOBufSelfDup.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelfDup.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelfDup.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufSelfDup.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_nb_buf.TestCCOBufSelfDup.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelfDup.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_nb_buf.TestCCOBufSelfDup.testReduce) ... 
ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufSelfDup.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufSelfDup.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_nb_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufSelfDup.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_nb_buf.TestCCOBufSelfDup.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_nb_buf.TestCCOBufWorld.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_nb_buf.TestCCOBufSelfDup.testScan) ... ok testScan (test_cco_nb_buf.TestCCOBufSelfDup.testScan) ... ok testAllgather (test_cco_nb_buf.TestCCOBufWorld.testAllgather) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testScatter (test_cco_nb_buf.TestCCOBufSelfDup.testScatter) ... ok
testAllgather (test_cco_nb_buf.TestCCOBufWorld.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufWorld.testAllreduce) ... ok
testAlltoall (test_cco_nb_buf.TestCCOBufWorld.testAlltoall) ... ok
testBarrier (test_cco_nb_buf.TestCCOBufWorld.testBarrier) ... ok
testBcast (test_cco_nb_buf.TestCCOBufWorld.testBcast) ... ok
testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufWorld.testBcastTypeIndexed) ... ok
testExscan (test_cco_nb_buf.TestCCOBufWorld.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufWorld.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufWorld.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufWorld.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufWorld.testScatter) ... ok
testAllgather (test_cco_nb_buf.TestCCOBufWorldDup.testAllgather) ... ok
testAllreduce (test_cco_nb_buf.TestCCOBufWorldDup.testAllreduce) ... ok
testAlltoall (test_cco_nb_buf.TestCCOBufWorldDup.testAlltoall) ... ok
testBarrier (test_cco_nb_buf.TestCCOBufWorldDup.testBarrier) ... ok
testBcast (test_cco_nb_buf.TestCCOBufWorldDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_nb_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_nb_buf.TestCCOBufWorldDup.testExscan) ... ok
testGather (test_cco_nb_buf.TestCCOBufWorldDup.testGather) ... ok
testReduce (test_cco_nb_buf.TestCCOBufWorldDup.testReduce) ... ok
testReduceScatter (test_cco_nb_buf.TestCCOBufWorldDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_nb_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok
testScan (test_cco_nb_buf.TestCCOBufWorldDup.testScan) ... ok
testScatter (test_cco_nb_buf.TestCCOBufWorldDup.testScatter) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok
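The test_cco_nb_buf cases above exercise mpi4py's nonblocking buffer collectives (e.g. `comm.Iallreduce`, `comm.Iexscan`) across the 4 MPI ranks visible in this log. As a minimal single-process sketch of the reduction semantics those tests check — this is illustrative pure Python, not mpi4py code, and the function names are my own, not part of any MPI API:

```python
# Serial reference for Allreduce / Scan / Exscan semantics.
# rank_values holds one contribution per rank; the returned list holds
# what each rank would receive.
from functools import reduce

def allreduce(rank_values, op=lambda a, b: a + b):
    """Every rank receives the reduction over all ranks' contributions."""
    total = reduce(op, rank_values)
    return [total for _ in rank_values]

def scan(rank_values, op=lambda a, b: a + b):
    """Rank i receives the reduction over ranks 0..i (inclusive prefix)."""
    out, acc = [], None
    for v in rank_values:
        acc = v if acc is None else op(acc, v)
        out.append(acc)
    return out

def exscan(rank_values, op=lambda a, b: a + b):
    """Rank i receives the reduction over ranks 0..i-1 (exclusive prefix).
    Rank 0's result is undefined in MPI; None stands in for it here."""
    out, acc = [], None
    for v in rank_values:
        out.append(acc)
        acc = v if acc is None else op(acc, v)
    return out

contributions = [1, 2, 3, 4]      # one value per rank, 4 ranks as in this log
print(allreduce(contributions))   # [10, 10, 10, 10]
print(scan(contributions))        # [1, 3, 6, 10]
print(exscan(contributions))      # [None, 1, 3, 6]
```

The real tests issue the nonblocking variants and wait on the returned request objects; only the delivered values, sketched here, are what the assertions compare.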
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallv) ... ok testAlltoallv (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok testAlltoallw2 (test_cco_nb_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv) ... ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv) ... ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv) ... ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv3) ... 
ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv3) ... ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelf.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv) ... ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv) ... ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv2) ... ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv2) ... ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv3) ... ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelf.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecSelf.testAlltoallw) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelf.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelf.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelf.testAlltoallwBottom) ... Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelf.testAlltoallwBottom) ... ok testGatherv (test_cco_nb_vec.TestCCOVecSelf.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_nb_vec.TestCCOVecSelf.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_nb_vec.TestCCOVecSelf.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_nb_vec.TestCCOVecSelf.testGatherv) ... ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelf.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelf.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelf.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelf.testGatherv3) ... ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelf.testGatherv2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelf.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelf.testGatherv3) ... ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelf.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testScatterv (test_cco_nb_vec.TestCCOVecSelf.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_nb_vec.TestCCOVecSelf.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelf.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelf.testScatterv2) ... ok testScatterv (test_cco_nb_vec.TestCCOVecSelf.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_nb_vec.TestCCOVecSelf.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelf.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelf.testScatterv3) ... ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelf.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelf.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelf.testScatterv3) ... ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelf.testScatterv3) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv) ... ok testAllgatherv (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv3) ... ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv) ... ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv3) ... ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallw) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAlltoallw (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv) ... ok testGatherv (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv2) ... ok testGatherv2 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv3) ... ok testGatherv3 (test_cco_nb_vec.TestCCOVecSelfDup.testGatherv3) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testScatterv (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv) ... ok testScatterv3 (test_cco_nb_vec.TestCCOVecSelfDup.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv2) ... ok testAllgatherv2 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv3) ... ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv3) ... ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv3) ... ok testAllgatherv3 (test_cco_nb_vec.TestCCOVecWorld.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv) ... ok testAlltoallv (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv) ... ok testAlltoallv (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv) ... ok testAlltoallv (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv2) ... 
ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv2) ... ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv2) ... ok testAlltoallv2 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv3) ... ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv3) ... ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv3) ... ok testAlltoallv3 (test_cco_nb_vec.TestCCOVecWorld.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_nb_vec.TestCCOVecWorld.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecWorld.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecWorld.testAlltoallw) ... ok testAlltoallw (test_cco_nb_vec.TestCCOVecWorld.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_nb_vec.TestCCOVecWorld.testAlltoallwBottom) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_nb_vec.TestCCOVecWorld.testGatherv) ... ok testGatherv (test_cco_nb_vec.TestCCOVecWorld.testGatherv) ... ok testGatherv (test_cco_nb_vec.TestCCOVecWorld.testGatherv) ... ok testGatherv (test_cco_nb_vec.TestCCOVecWorld.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_nb_vec.TestCCOVecWorld.testGatherv2) ... ok testGatherv2 (test_cco_nb_vec.TestCCOVecWorld.testGatherv2) ... ok testGatherv2 (test_cco_nb_vec.TestCCOVecWorld.testGatherv2) ... ok testGatherv2 (test_cco_nb_vec.TestCCOVecWorld.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_nb_vec.TestCCOVecWorld.testGatherv3) ... ok testGatherv3 (test_cco_nb_vec.TestCCOVecWorld.testGatherv3) ... ok testGatherv3 (test_cco_nb_vec.TestCCOVecWorld.testGatherv3) ... ok testGatherv3 (test_cco_nb_vec.TestCCOVecWorld.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_nb_vec.TestCCOVecWorld.testScatterv) ... ok testScatterv (test_cco_nb_vec.TestCCOVecWorld.testScatterv) ... ok testScatterv (test_cco_nb_vec.TestCCOVecWorld.testScatterv) ... ok testScatterv (test_cco_nb_vec.TestCCOVecWorld.testScatterv) ... 
testScatterv2 (test_cco_nb_vec.TestCCOVecWorld.testScatterv2) ... ok
testScatterv3 (test_cco_nb_vec.TestCCOVecWorld.testScatterv3) ... ok
testAllgatherv (test_cco_nb_vec.TestCCOVecWorldDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_nb_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_nb_vec.TestCCOVecWorldDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_nb_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_nb_vec.TestCCOVecWorldDup.testGatherv) ... ok
testGatherv2 (test_cco_nb_vec.TestCCOVecWorldDup.testGatherv2) ... ok
testGatherv3 (test_cco_nb_vec.TestCCOVecWorldDup.testGatherv3) ... ok
testScatterv (test_cco_nb_vec.TestCCOVecWorldDup.testScatterv) ... ok
testScatterv2 (test_cco_nb_vec.TestCCOVecWorldDup.testScatterv2) ... ok
testScatterv3 (test_cco_nb_vec.TestCCOVecWorldDup.testScatterv3) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAlltoall) ... ok
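The vector-variant tests above (Alltoallv/Gatherv/Scatterv and friends) exercise per-peer count/displacement bookkeeping. As an MPI-free illustration only (these names are not mpi4py API), the core bookkeeping can be sketched in plain Python:

```python
# Illustrative sketch, no MPI required: how Alltoallv/Gatherv-style
# (counts, displacements) pairs carve per-peer slices out of one flat buffer.
def split_by_counts(buf, counts):
    """Return per-peer slices of buf; displacements are the prefix sums of counts."""
    displs = [0]
    for c in counts[:-1]:
        displs.append(displs[-1] + c)  # each peer's offset into the flat buffer
    return [buf[d:d + c] for d, c in zip(displs, counts)]

# A 10-element buffer split unevenly across 4 peers:
chunks = split_by_counts(list(range(10)), [2, 3, 1, 4])
# → [[0, 1], [2, 3, 4], [5], [6, 7, 8, 9]]
```

The tests' "v" suffix refers to exactly this: each rank may send or receive a different count, with displacements locating each contribution in the buffer.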
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoall) ... ok
testNeighborAlltoallw (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallw) ... ok
testNeighborAlltoallwBottom (test_cco_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallwBottom) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjSelf.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjSelf.testNeighborAlltoall) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjSelfDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjSelfDup.testNeighborAlltoall) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjWorld.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjWorld.testNeighborAlltoall) ... ok
testNeighborAllgather (test_cco_ngh_obj.TestCCONghObjWorldDup.testNeighborAllgather) ... ok
testNeighborAlltoall (test_cco_ngh_obj.TestCCONghObjWorldDup.testNeighborAlltoall) ... ok
testAllgather (test_cco_obj.TestCCOObjSelf.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjSelf.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjSelf.testAlltoall) ... ok
testBarrier (test_cco_obj.TestCCOObjSelf.testBarrier) ... ok
testBcast (test_cco_obj.TestCCOObjSelf.testBcast) ... ok
testExscan (test_cco_obj.TestCCOObjSelf.testExscan) ... ok
testGather (test_cco_obj.TestCCOObjSelf.testGather) ... ok
testReduce (test_cco_obj.TestCCOObjSelf.testReduce) ... ok
testScan (test_cco_obj.TestCCOObjSelf.testScan) ... ok
testScatter (test_cco_obj.TestCCOObjSelf.testScatter) ... ok
testAllgather (test_cco_obj.TestCCOObjSelfDup.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjSelfDup.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjSelfDup.testAlltoall) ... ok
testBarrier (test_cco_obj.TestCCOObjSelfDup.testBarrier) ... ok
testBcast (test_cco_obj.TestCCOObjSelfDup.testBcast) ... ok
testExscan (test_cco_obj.TestCCOObjSelfDup.testExscan) ... ok
testGather (test_cco_obj.TestCCOObjSelfDup.testGather) ... ok
testReduce (test_cco_obj.TestCCOObjSelfDup.testReduce) ... ok
testScan (test_cco_obj.TestCCOObjSelfDup.testScan) ... ok
testScatter (test_cco_obj.TestCCOObjSelfDup.testScatter) ... ok
testAllgather (test_cco_obj.TestCCOObjWorld.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjWorld.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjWorld.testAlltoall) ... ok
testBarrier (test_cco_obj.TestCCOObjWorld.testBarrier) ... ok
testBcast (test_cco_obj.TestCCOObjWorld.testBcast) ... ok
testExscan (test_cco_obj.TestCCOObjWorld.testExscan) ... ok
testGather (test_cco_obj.TestCCOObjWorld.testGather) ... ok
testReduce (test_cco_obj.TestCCOObjWorld.testReduce) ... ok
testScan (test_cco_obj.TestCCOObjWorld.testScan) ... ok
testScatter (test_cco_obj.TestCCOObjWorld.testScatter) ... ok
testAllgather (test_cco_obj.TestCCOObjWorldDup.testAllgather) ... ok
testAllreduce (test_cco_obj.TestCCOObjWorldDup.testAllreduce) ... ok
testAlltoall (test_cco_obj.TestCCOObjWorldDup.testAlltoall) ... ok
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_obj.TestCCOObjWorldDup.testBarrier) ... ok testBcast (test_cco_obj.TestCCOObjWorldDup.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_obj.TestCCOObjWorldDup.testBarrier) ... ok testBcast (test_cco_obj.TestCCOObjWorldDup.testBcast) ... ok testBarrier (test_cco_obj.TestCCOObjWorldDup.testBarrier) ... ok testBcast (test_cco_obj.TestCCOObjWorldDup.testBcast) ... ok testBarrier (test_cco_obj.TestCCOObjWorldDup.testBarrier) ... ok testBcast (test_cco_obj.TestCCOObjWorldDup.testBcast) ... ok ok testExscan (test_cco_obj.TestCCOObjWorldDup.testExscan) ... ok testGather (test_cco_obj.TestCCOObjWorldDup.testGather) ... ok testExscan (test_cco_obj.TestCCOObjWorldDup.testExscan) ... ok testGather (test_cco_obj.TestCCOObjWorldDup.testGather) ... testExscan (test_cco_obj.TestCCOObjWorldDup.testExscan) ... ok testGather (test_cco_obj.TestCCOObjWorldDup.testGather) ... ok testExscan (test_cco_obj.TestCCOObjWorldDup.testExscan) ... ok testGather (test_cco_obj.TestCCOObjWorldDup.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_obj.TestCCOObjWorldDup.testReduce) ... ok testReduce (test_cco_obj.TestCCOObjWorldDup.testReduce) ... ok testReduce (test_cco_obj.TestCCOObjWorldDup.testReduce) ... ok testReduce (test_cco_obj.TestCCOObjWorldDup.testReduce) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_obj.TestCCOObjWorldDup.testScan) ... ok testScatter (test_cco_obj.TestCCOObjWorldDup.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_obj.TestCCOObjWorldDup.testScan) ... ok testScatter (test_cco_obj.TestCCOObjWorldDup.testScatter) ... ok testScan (test_cco_obj.TestCCOObjWorldDup.testScan) ... ok testScatter (test_cco_obj.TestCCOObjWorldDup.testScatter) ... ok testScan (test_cco_obj.TestCCOObjWorldDup.testScan) ... ok testScatter (test_cco_obj.TestCCOObjWorldDup.testScatter) ... ok testAllgather (test_cco_obj_inter.TestCCOObjInter.testAllgather) ... ok testAllgather (test_cco_obj_inter.TestCCOObjInter.testAllgather) ... ok testAllgather (test_cco_obj_inter.TestCCOObjInter.testAllgather) ... ok testAllgather (test_cco_obj_inter.TestCCOObjInter.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_obj_inter.TestCCOObjInter.testAllreduce) ... ok testAlltoall (test_cco_obj_inter.TestCCOObjInter.testAlltoall) ... ok testBarrier (test_cco_obj_inter.TestCCOObjInter.testBarrier) ... ok ok testAllreduce (test_cco_obj_inter.TestCCOObjInter.testAllreduce) ... ok testAlltoall (test_cco_obj_inter.TestCCOObjInter.testAlltoall) ... ok testBarrier (test_cco_obj_inter.TestCCOObjInter.testBarrier) ... 
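The test_cco_obj suites above exercise mpi4py's pickle-based "object" collectives (comm.bcast, comm.gather, comm.allreduce, comm.scan, and friends) on self, world, and duplicated communicators. As a rough single-process sketch of the semantics those tests check — not mpi4py code; the helper names and the list-of-ranks model are invented for illustration:

```python
# Single-process sketch of the semantics exercised by test_cco_obj.
# Each list element stands for one simulated rank's contribution;
# these plain functions only mimic what mpi4py's object collectives
# return on each rank. Illustrative, not mpi4py's implementation.
from functools import reduce
from itertools import accumulate
import operator

def gather(values, root=0):
    # Every rank contributes one object; only `root` receives the list.
    return [list(values) if rank == root else None
            for rank in range(len(values))]

def allreduce(values, op=operator.add):
    # Every rank receives the same reduction of all contributions.
    result = reduce(op, values)
    return [result] * len(values)

def scan(values, op=operator.add):
    # Rank i receives the reduction of contributions from ranks 0..i.
    return list(accumulate(values, op))

if __name__ == "__main__":
    contributions = [0, 1, 2, 3]        # one object per simulated rank
    print(gather(contributions))        # [[0, 1, 2, 3], None, None, None]
    print(allreduce(contributions))     # [6, 6, 6, 6]
    print(scan(contributions))          # [0, 1, 3, 6]
```

The real suite runs the same checks across MPI.COMM_SELF, MPI.COMM_WORLD, and Dup()ed copies, which is why each case name appears once per communicator class in the log.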
testAllreduce (test_cco_obj_inter.TestCCOObjInter.testAllreduce) ... ok
testAlltoall (test_cco_obj_inter.TestCCOObjInter.testAlltoall) ... ok
testBarrier (test_cco_obj_inter.TestCCOObjInter.testBarrier) ... ok
testBcast (test_cco_obj_inter.TestCCOObjInter.testBcast) ... ok
testGather (test_cco_obj_inter.TestCCOObjInter.testGather) ... ok
testReduce (test_cco_obj_inter.TestCCOObjInter.testReduce) ... ok
testScatter (test_cco_obj_inter.TestCCOObjInter.testScatter) ... ok
testAllgather (test_cco_obj_inter.TestCCOObjInterDup.testAllgather) ... ok
testAllreduce (test_cco_obj_inter.TestCCOObjInterDup.testAllreduce) ... ok
testAlltoall (test_cco_obj_inter.TestCCOObjInterDup.testAlltoall) ... ok
testBarrier (test_cco_obj_inter.TestCCOObjInterDup.testBarrier) ... ok
testBcast (test_cco_obj_inter.TestCCOObjInterDup.testBcast) ... ok
testGather (test_cco_obj_inter.TestCCOObjInterDup.testGather) ... ok
testReduce (test_cco_obj_inter.TestCCOObjInterDup.testReduce) ... ok
testScatter (test_cco_obj_inter.TestCCOObjInterDup.testScatter) ... ok
testAllgather (test_cco_obj_inter.TestCCOObjInterDupDup.testAllgather) ... ok
testAllreduce (test_cco_obj_inter.TestCCOObjInterDupDup.testAllreduce) ... ok
testAlltoall (test_cco_obj_inter.TestCCOObjInterDupDup.testAlltoall) ... ok
testBarrier (test_cco_obj_inter.TestCCOObjInterDupDup.testBarrier) ... ok
testBcast (test_cco_obj_inter.TestCCOObjInterDupDup.testBcast) ... ok
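The test_cco_obj_inter suites repeat the object collectives over intercommunicators, where data moves between two disjoint groups rather than within one: for example, an intercommunicator broadcast delivers the root's object to every rank of the *remote* group. A minimal sketch of that flow, with an invented group/rank layout:

```python
# Sketch of intercommunicator broadcast semantics checked by
# test_cco_obj_inter. The two groups and the return shape are
# invented for illustration; this is not mpi4py code.
def inter_bcast(local_group, remote_group, root_rank, obj):
    # The root in local_group sends; every rank of remote_group receives.
    received = {rank: obj for rank in remote_group}
    # Non-root members of local_group take no part and get nothing back.
    unchanged = {rank: None for rank in local_group if rank != root_rank}
    return received, unchanged

if __name__ == "__main__":
    recv, idle = inter_bcast([0, 1], [2, 3], root_rank=0, obj="payload")
    print(recv)   # {2: 'payload', 3: 'payload'}
    print(idle)   # {1: None}
```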
testGather (test_cco_obj_inter.TestCCOObjInterDupDup.testGather) ... ok
testReduce (test_cco_obj_inter.TestCCOObjInterDupDup.testReduce) ... ok
testScatter (test_cco_obj_inter.TestCCOObjInterDupDup.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufInplaceSelf.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufInplaceSelf.testAllreduce) ... ok
testExscan (test_cco_pr_buf.TestCCOBufInplaceSelf.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufInplaceSelf.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufInplaceSelf.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufInplaceSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufInplaceSelf.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufInplaceSelf.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufInplaceSelf.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufInplaceWorld.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufInplaceWorld.testAllreduce) ... ok
testExscan (test_cco_pr_buf.TestCCOBufInplaceWorld.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufInplaceWorld.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufInplaceWorld.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufInplaceWorld.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufInplaceWorld.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufInplaceWorld.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufInplaceWorld.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufSelf.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufSelf.testAllreduce) ... ok
testAlltoall (test_cco_pr_buf.TestCCOBufSelf.testAlltoall) ... ok
testBarrier (test_cco_pr_buf.TestCCOBufSelf.testBarrier) ... ok
testBcast (test_cco_pr_buf.TestCCOBufSelf.testBcast) ... ok
testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufSelf.testBcastTypeIndexed) ... ok
testExscan (test_cco_pr_buf.TestCCOBufSelf.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufSelf.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufSelf.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufSelf.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufSelf.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufSelf.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufSelf.testScatter) ... ok
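The test_cco_pr_buf TestCCOBufInplace* cases appear to exercise the MPI.IN_PLACE variants of the buffer collectives, where each rank's receive buffer initially holds its own contribution and is overwritten with the result, so no separate send buffer is needed. A pure-Python stand-in for that in-place allreduce behaviour (the function and the lists-as-buffers model are invented for illustration; not mpi4py code):

```python
# Sketch of MPI.IN_PLACE allreduce semantics: each simulated rank's
# buffer starts out holding that rank's contribution and ends up
# holding the element-wise reduction across all ranks.
def allreduce_in_place(buffers):
    # Element-wise sum across all simulated ranks' buffers.
    totals = [sum(column) for column in zip(*buffers)]
    for buf in buffers:          # overwrite each rank's buffer in place
        buf[:] = totals
    return buffers

if __name__ == "__main__":
    bufs = [[1, 2], [3, 4], [5, 6]]   # one buffer per simulated rank
    allreduce_in_place(bufs)
    print(bufs)                        # [[9, 12], [9, 12], [9, 12]]
```

Mutating the buffers rather than returning fresh ones mirrors why these tests matter: the in-place code path in an MPI implementation is distinct from the two-buffer path and can break independently.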
testScatter (test_cco_pr_buf.TestCCOBufSelf.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufSelfDup.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufSelfDup.testAllreduce) ... ok
testAlltoall (test_cco_pr_buf.TestCCOBufSelfDup.testAlltoall) ... ok
testBarrier (test_cco_pr_buf.TestCCOBufSelfDup.testBarrier) ... ok
testBcast (test_cco_pr_buf.TestCCOBufSelfDup.testBcast) ... ok
testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufSelfDup.testBcastTypeIndexed) ... ok
testExscan (test_cco_pr_buf.TestCCOBufSelfDup.testExscan) ... ok
testGather (test_cco_pr_buf.TestCCOBufSelfDup.testGather) ... ok
testReduce (test_cco_pr_buf.TestCCOBufSelfDup.testReduce) ... ok
testReduceScatter (test_cco_pr_buf.TestCCOBufSelfDup.testReduceScatter) ... ok
testReduceScatterBlock (test_cco_pr_buf.TestCCOBufSelfDup.testReduceScatterBlock) ... ok
testScan (test_cco_pr_buf.TestCCOBufSelfDup.testScan) ... ok
testScatter (test_cco_pr_buf.TestCCOBufSelfDup.testScatter) ... ok
testAllgather (test_cco_pr_buf.TestCCOBufWorld.testAllgather) ... ok
testAllreduce (test_cco_pr_buf.TestCCOBufWorld.testAllreduce) ... ok
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_pr_buf.TestCCOBufWorld.testAlltoall) ... ok testAlltoall (test_cco_pr_buf.TestCCOBufWorld.testAlltoall) ... ok testAlltoall (test_cco_pr_buf.TestCCOBufWorld.testAlltoall) ... ok testAlltoall (test_cco_pr_buf.TestCCOBufWorld.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_pr_buf.TestCCOBufWorld.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorld.testBcast) ... ok testBarrier (test_cco_pr_buf.TestCCOBufWorld.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorld.testBcast) ... ok testBarrier (test_cco_pr_buf.TestCCOBufWorld.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorld.testBcast) ... ok testBarrier (test_cco_pr_buf.TestCCOBufWorld.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorld.testBcast) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorld.testBcastTypeIndexed) ... ok testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorld.testBcastTypeIndexed) ... ok testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorld.testBcastTypeIndexed) ... testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorld.testBcastTypeIndexed) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_pr_buf.TestCCOBufWorld.testExscan) ... ok testExscan (test_cco_pr_buf.TestCCOBufWorld.testExscan) ... ok testExscan (test_cco_pr_buf.TestCCOBufWorld.testExscan) ... ok testExscan (test_cco_pr_buf.TestCCOBufWorld.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_pr_buf.TestCCOBufWorld.testGather) ... ok testGather (test_cco_pr_buf.TestCCOBufWorld.testGather) ... ok testGather (test_cco_pr_buf.TestCCOBufWorld.testGather) ... ok testGather (test_cco_pr_buf.TestCCOBufWorld.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_pr_buf.TestCCOBufWorld.testReduce) ... ok testReduce (test_cco_pr_buf.TestCCOBufWorld.testReduce) ... ok testReduce (test_cco_pr_buf.TestCCOBufWorld.testReduce) ... ok testReduce (test_cco_pr_buf.TestCCOBufWorld.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorld.testReduceScatter) ... ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorld.testReduceScatter) ... ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorld.testReduceScatter) ... ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorld.testReduceScatter) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorld.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorld.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_pr_buf.TestCCOBufWorld.testScan) ... ok testScan (test_cco_pr_buf.TestCCOBufWorld.testScan) ... ok testScan (test_cco_pr_buf.TestCCOBufWorld.testScan) ... ok testScan (test_cco_pr_buf.TestCCOBufWorld.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_pr_buf.TestCCOBufWorld.testScatter) ... ok testScatter (test_cco_pr_buf.TestCCOBufWorld.testScatter) ... ok testScatter (test_cco_pr_buf.TestCCOBufWorld.testScatter) ... ok testScatter (test_cco_pr_buf.TestCCOBufWorld.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgather (test_cco_pr_buf.TestCCOBufWorldDup.testAllgather) ... ok testAllgather (test_cco_pr_buf.TestCCOBufWorldDup.testAllgather) ... ok testAllgather (test_cco_pr_buf.TestCCOBufWorldDup.testAllgather) ... 
ok testAllgather (test_cco_pr_buf.TestCCOBufWorldDup.testAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllreduce (test_cco_pr_buf.TestCCOBufWorldDup.testAllreduce) ... ok testAllreduce (test_cco_pr_buf.TestCCOBufWorldDup.testAllreduce) ... ok testAllreduce (test_cco_pr_buf.TestCCOBufWorldDup.testAllreduce) ... ok testAllreduce (test_cco_pr_buf.TestCCOBufWorldDup.testAllreduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoall (test_cco_pr_buf.TestCCOBufWorldDup.testAlltoall) ... ok testAlltoall (test_cco_pr_buf.TestCCOBufWorldDup.testAlltoall) ... ok testAlltoall (test_cco_pr_buf.TestCCOBufWorldDup.testAlltoall) ... ok testAlltoall (test_cco_pr_buf.TestCCOBufWorldDup.testAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBarrier (test_cco_pr_buf.TestCCOBufWorldDup.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorldDup.testBcast) ... ok testBarrier (test_cco_pr_buf.TestCCOBufWorldDup.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorldDup.testBcast) ... ok testBarrier (test_cco_pr_buf.TestCCOBufWorldDup.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorldDup.testBcast) ... ok testBarrier (test_cco_pr_buf.TestCCOBufWorldDup.testBarrier) ... ok testBcast (test_cco_pr_buf.TestCCOBufWorldDup.testBcast) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... ok testBcastTypeIndexed (test_cco_pr_buf.TestCCOBufWorldDup.testBcastTypeIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testExscan (test_cco_pr_buf.TestCCOBufWorldDup.testExscan) ... ok testExscan (test_cco_pr_buf.TestCCOBufWorldDup.testExscan) ... ok testExscan (test_cco_pr_buf.TestCCOBufWorldDup.testExscan) ... ok testExscan (test_cco_pr_buf.TestCCOBufWorldDup.testExscan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGather (test_cco_pr_buf.TestCCOBufWorldDup.testGather) ... ok testGather (test_cco_pr_buf.TestCCOBufWorldDup.testGather) ... ok testGather (test_cco_pr_buf.TestCCOBufWorldDup.testGather) ... ok testGather (test_cco_pr_buf.TestCCOBufWorldDup.testGather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduce (test_cco_pr_buf.TestCCOBufWorldDup.testReduce) ... ok testReduce (test_cco_pr_buf.TestCCOBufWorldDup.testReduce) ... ok testReduce (test_cco_pr_buf.TestCCOBufWorldDup.testReduce) ... 
ok testReduce (test_cco_pr_buf.TestCCOBufWorldDup.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatter) ... ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatter) ... ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatter) ... ok testReduceScatter (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... ok testReduceScatterBlock (test_cco_pr_buf.TestCCOBufWorldDup.testReduceScatterBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScan (test_cco_pr_buf.TestCCOBufWorldDup.testScan) ... ok testScan (test_cco_pr_buf.TestCCOBufWorldDup.testScan) ... ok testScan (test_cco_pr_buf.TestCCOBufWorldDup.testScan) ... ok testScan (test_cco_pr_buf.TestCCOBufWorldDup.testScan) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatter (test_cco_pr_buf.TestCCOBufWorldDup.testScatter) ... 
ok testScatter (test_cco_pr_buf.TestCCOBufWorldDup.testScatter) ... ok testScatter (test_cco_pr_buf.TestCCOBufWorldDup.testScatter) ... ok testScatter (test_cco_pr_buf.TestCCOBufWorldDup.testScatter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAllgather) ... ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAllgather) ... ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAllgather) ... ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoall) ... ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallw) ... ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelf.testNeighborAlltoallw) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoall) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallw) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufSelfDup.testNeighborAlltoallw) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoall) ... ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoall) ... ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoall) ... ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallw) ... ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallw) ... ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallw) ... ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorld.testNeighborAlltoallw) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAllgather) ... ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAllgather) ... ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAllgather) ... ok testNeighborAllgather (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAllgather) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoall) ... ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoall) ... ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoall) ... ok testNeighborAlltoall (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoall) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallw) ... ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallw) ... ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallw) ... ok testNeighborAlltoallw (test_cco_pr_ngh_buf.TestCCONghBufWorldDup.testNeighborAlltoallw) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallv) ... 
ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallv) ... ok testAlltoallv (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok testAlltoallw (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok testAlltoallw2 (test_cco_pr_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv) ... ok testAllgatherv (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv) ... ok testAllgatherv (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv) ... ok testAllgatherv (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv2) ... ok testAllgatherv2 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv2) ... ok testAllgatherv2 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv2) ... ok testAllgatherv2 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv3) ... ok testAllgatherv3 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv3) ... ok testAllgatherv3 (test_cco_pr_vec.TestCCOVecSelf.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv) ... testAlltoallv (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv) ... ok testAlltoallv (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv2) ... 
ok testAlltoallv2 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv2) ... ok testAlltoallv2 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv3) ... ok testAlltoallv3 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_pr_vec.TestCCOVecSelf.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_pr_vec.TestCCOVecSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_pr_vec.TestCCOVecSelf.testAlltoallw) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAlltoallw (test_cco_pr_vec.TestCCOVecSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_pr_vec.TestCCOVecSelf.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_pr_vec.TestCCOVecSelf.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_pr_vec.TestCCOVecSelf.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_pr_vec.TestCCOVecSelf.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_pr_vec.TestCCOVecSelf.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_pr_vec.TestCCOVecSelf.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_pr_vec.TestCCOVecSelf.testGatherv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_pr_vec.TestCCOVecSelf.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_pr_vec.TestCCOVecSelf.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_pr_vec.TestCCOVecSelf.testGatherv2) ... ok testGatherv2 (test_cco_pr_vec.TestCCOVecSelf.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_pr_vec.TestCCOVecSelf.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_pr_vec.TestCCOVecSelf.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_pr_vec.TestCCOVecSelf.testGatherv3) ... ok testGatherv3 (test_cco_pr_vec.TestCCOVecSelf.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_pr_vec.TestCCOVecSelf.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_pr_vec.TestCCOVecSelf.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_pr_vec.TestCCOVecSelf.testScatterv) ... ok testScatterv (test_cco_pr_vec.TestCCOVecSelf.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_pr_vec.TestCCOVecSelf.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_pr_vec.TestCCOVecSelf.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_pr_vec.TestCCOVecSelf.testScatterv) ... 
testScatterv3 (test_cco_pr_vec.TestCCOVecSelf.testScatterv3) ... ok
testAllgatherv (test_cco_pr_vec.TestCCOVecSelfDup.testAllgatherv) ... ok
testAllgatherv2 (test_cco_pr_vec.TestCCOVecSelfDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_pr_vec.TestCCOVecSelfDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_pr_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_pr_vec.TestCCOVecSelfDup.testGatherv) ... ok
testGatherv2 (test_cco_pr_vec.TestCCOVecSelfDup.testGatherv2) ... ok
testGatherv3 (test_cco_pr_vec.TestCCOVecSelfDup.testGatherv3) ... ok
testScatterv (test_cco_pr_vec.TestCCOVecSelfDup.testScatterv) ... ok
testScatterv2 (test_cco_pr_vec.TestCCOVecSelfDup.testScatterv2) ... ok
testScatterv3 (test_cco_pr_vec.TestCCOVecSelfDup.testScatterv3) ... ok
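The Gatherv/Scatterv cases above exercise the vector ("v") collectives, which take per-rank counts and displacements instead of one uniform count. As an aside, a minimal MPI-free sketch of how such count/displacement arrays are typically derived (`counts_displs` is a hypothetical helper for illustration, not part of mpi4py or its test suite):

```python
# Partition n items as evenly as possible across `size` ranks, producing
# the counts/displacements arrays that drive Scatterv/Gatherv-style calls.
def counts_displs(n, size):
    base, extra = divmod(n, size)
    counts = [base + (1 if rank < extra else 0) for rank in range(size)]
    displs = [sum(counts[:rank]) for rank in range(size)]
    return counts, displs

counts, displs = counts_displs(10, 4)
print(counts, displs)  # [3, 3, 2, 2] [0, 3, 6, 8]
```

Every item is assigned exactly once: the counts sum to `n`, and each rank's displacement is the running total of the counts before it.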
testAllgatherv (test_cco_pr_vec.TestCCOVecWorld.testAllgatherv) ... ok
testAllgatherv2 (test_cco_pr_vec.TestCCOVecWorld.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_pr_vec.TestCCOVecWorld.testAllgatherv3) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecWorld.testAlltoallv) ... ok
testAlltoallv2 (test_cco_pr_vec.TestCCOVecWorld.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_pr_vec.TestCCOVecWorld.testAlltoallv3) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecWorld.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_pr_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok
testGatherv (test_cco_pr_vec.TestCCOVecWorld.testGatherv) ... ok
testGatherv2 (test_cco_pr_vec.TestCCOVecWorld.testGatherv2) ... ok
testGatherv3 (test_cco_pr_vec.TestCCOVecWorld.testGatherv3) ... ok
testScatterv (test_cco_pr_vec.TestCCOVecWorld.testScatterv) ... ok
testScatterv2 (test_cco_pr_vec.TestCCOVecWorld.testScatterv2) ... ok
testScatterv3 (test_cco_pr_vec.TestCCOVecWorld.testScatterv3) ... ok
testAllgatherv (test_cco_pr_vec.TestCCOVecWorldDup.testAllgatherv) ... ok
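The Alltoallv cases being run here verify the variable-count all-to-all exchange. Its data movement can be modelled in plain Python (a sketch of the semantics only, not the mpi4py API, which works on typed buffers under a live communicator):

```python
# Plain-Python model of MPI_Alltoallv semantics: rank i's j-th block is
# delivered to rank j, so rank j ends up with one block from every rank.
def alltoallv(send_blocks):
    size = len(send_blocks)
    return [[send_blocks[i][j] for i in range(size)]
            for j in range(size)]

send = [[f"r{i}->r{j}" for j in range(3)] for i in range(3)]
recv = alltoallv(send)
print(recv[1])  # ['r0->r1', 'r1->r1', 'r2->r1']
```

The exchange is a transpose of the per-rank block matrix, which is why the tests can check the result deterministically on any number of ranks.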
testAllgatherv2 (test_cco_pr_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_pr_vec.TestCCOVecWorldDup.testAllgatherv3) ... ok
testAlltoallv (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallv) ... ok
testAlltoallv2 (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok
testAlltoallw (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_pr_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok
testGatherv (test_cco_pr_vec.TestCCOVecWorldDup.testGatherv) ... ok
testGatherv2 (test_cco_pr_vec.TestCCOVecWorldDup.testGatherv2) ... ok
testGatherv3 (test_cco_pr_vec.TestCCOVecWorldDup.testGatherv3) ... ok
testScatterv (test_cco_pr_vec.TestCCOVecWorldDup.testScatterv) ... ok
testScatterv2 (test_cco_pr_vec.TestCCOVecWorldDup.testScatterv2) ... ok
testScatterv3 (test_cco_pr_vec.TestCCOVecWorldDup.testScatterv3) ... ok
testAlltoallv (test_cco_vec.TestCCOVecInplaceSelf.testAlltoallv) ... ok
testAlltoallw (test_cco_vec.TestCCOVecInplaceSelf.testAlltoallw) ... ok
testAlltoallw2 (test_cco_vec.TestCCOVecInplaceSelf.testAlltoallw2) ... ok
testAlltoallv (test_cco_vec.TestCCOVecInplaceWorld.testAlltoallv) ... ok
testAlltoallw (test_cco_vec.TestCCOVecInplaceWorld.testAlltoallw) ... ok
testAlltoallw2 (test_cco_vec.TestCCOVecInplaceWorld.testAlltoallw2) ... ok
testAllgatherv (test_cco_vec.TestCCOVecSelf.testAllgatherv) ... ok
testAllgatherv2 (test_cco_vec.TestCCOVecSelf.testAllgatherv2) ... ok
testAllgatherv3 (test_cco_vec.TestCCOVecSelf.testAllgatherv3) ... ok
testAlltoallv (test_cco_vec.TestCCOVecSelf.testAlltoallv) ... ok
testAlltoallv2 (test_cco_vec.TestCCOVecSelf.testAlltoallv2) ... ok
testAlltoallv3 (test_cco_vec.TestCCOVecSelf.testAlltoallv3) ... ok
testAlltoallw (test_cco_vec.TestCCOVecSelf.testAlltoallw) ... ok
testAlltoallwBottom (test_cco_vec.TestCCOVecSelf.testAlltoallwBottom) ... ok
testGatherv (test_cco_vec.TestCCOVecSelf.testGatherv) ... ok
testGatherv2 (test_cco_vec.TestCCOVecSelf.testGatherv2) ... ok
testGatherv3 (test_cco_vec.TestCCOVecSelf.testGatherv3) ... ok
testScatterv (test_cco_vec.TestCCOVecSelf.testScatterv) ... ok
testScatterv2 (test_cco_vec.TestCCOVecSelf.testScatterv2) ... ok
testScatterv3 (test_cco_vec.TestCCOVecSelf.testScatterv3) ... ok
testAllgatherv (test_cco_vec.TestCCOVecSelfDup.testAllgatherv) ... ok
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testAllgatherv2 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv2) ... testAllgatherv (test_cco_vec.TestCCOVecSelfDup.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_vec.TestCCOVecSelfDup.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv2) ... ok testAllgatherv3 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_vec.TestCCOVecSelfDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_vec.TestCCOVecSelfDup.testAllgatherv3) ... ok testAlltoallv (test_cco_vec.TestCCOVecSelfDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv3) ... ok testAlltoallv (test_cco_vec.TestCCOVecSelfDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_vec.TestCCOVecSelfDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_vec.TestCCOVecSelfDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_vec.TestCCOVecSelfDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_vec.TestCCOVecSelfDup.testAlltoallv3) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_vec.TestCCOVecSelfDup.testGatherv) ... ok testAlltoallw (test_cco_vec.TestCCOVecSelfDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_vec.TestCCOVecSelfDup.testGatherv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_vec.TestCCOVecSelfDup.testGatherv2) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_vec.TestCCOVecSelfDup.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_vec.TestCCOVecSelfDup.testGatherv3) ... ok testGatherv (test_cco_vec.TestCCOVecSelfDup.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testGatherv3 (test_cco_vec.TestCCOVecSelfDup.testGatherv3) ... testAlltoallw (test_cco_vec.TestCCOVecSelfDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGatherv2 (test_cco_vec.TestCCOVecSelfDup.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_vec.TestCCOVecSelfDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_vec.TestCCOVecSelfDup.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_vec.TestCCOVecSelfDup.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_vec.TestCCOVecSelfDup.testScatterv2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_vec.TestCCOVecSelfDup.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_vec.TestCCOVecSelfDup.testScatterv2) ... ok testScatterv3 (test_cco_vec.TestCCOVecSelfDup.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_vec.TestCCOVecSelfDup.testScatterv3) ... ok testGatherv3 (test_cco_vec.TestCCOVecSelfDup.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_vec.TestCCOVecSelfDup.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_vec.TestCCOVecSelfDup.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_vec.TestCCOVecWorld.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_vec.TestCCOVecWorld.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_vec.TestCCOVecSelfDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_vec.TestCCOVecSelfDup.testScatterv2) ... ok testAllgatherv (test_cco_vec.TestCCOVecWorld.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_vec.TestCCOVecSelfDup.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_vec.TestCCOVecWorld.testAllgatherv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_vec.TestCCOVecWorld.testAllgatherv2) ... ok testAllgatherv2 (test_cco_vec.TestCCOVecWorld.testAllgatherv2) ... ok testAllgatherv2 (test_cco_vec.TestCCOVecWorld.testAllgatherv2) ... ok testAllgatherv2 (test_cco_vec.TestCCOVecWorld.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_vec.TestCCOVecWorld.testAllgatherv3) ... ok testAllgatherv3 (test_cco_vec.TestCCOVecWorld.testAllgatherv3) ... ok testAllgatherv3 (test_cco_vec.TestCCOVecWorld.testAllgatherv3) ... ok testAllgatherv3 (test_cco_vec.TestCCOVecWorld.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_vec.TestCCOVecWorld.testAlltoallv) ... ok testAlltoallv (test_cco_vec.TestCCOVecWorld.testAlltoallv) ... ok testAlltoallv (test_cco_vec.TestCCOVecWorld.testAlltoallv) ... ok testAlltoallv (test_cco_vec.TestCCOVecWorld.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_vec.TestCCOVecWorld.testAlltoallv2) ... ok testAlltoallv2 (test_cco_vec.TestCCOVecWorld.testAlltoallv2) ... ok testAlltoallv2 (test_cco_vec.TestCCOVecWorld.testAlltoallv2) ... ok testAlltoallv2 (test_cco_vec.TestCCOVecWorld.testAlltoallv2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_vec.TestCCOVecWorld.testAlltoallv3) ... ok testAlltoallv3 (test_cco_vec.TestCCOVecWorld.testAlltoallv3) ... ok testAlltoallv3 (test_cco_vec.TestCCOVecWorld.testAlltoallv3) ... ok testAlltoallv3 (test_cco_vec.TestCCOVecWorld.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_vec.TestCCOVecWorld.testAlltoallw) ... ok testAlltoallw (test_cco_vec.TestCCOVecWorld.testAlltoallw) ... ok testAlltoallw (test_cco_vec.TestCCOVecWorld.testAlltoallw) ... testAlltoallw (test_cco_vec.TestCCOVecWorld.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorld.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorld.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_vec.TestCCOVecWorld.testGatherv) ... ok testGatherv (test_cco_vec.TestCCOVecWorld.testGatherv) ... 
ok testGatherv (test_cco_vec.TestCCOVecWorld.testGatherv) ... ok testGatherv (test_cco_vec.TestCCOVecWorld.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_vec.TestCCOVecWorld.testGatherv2) ... ok testGatherv2 (test_cco_vec.TestCCOVecWorld.testGatherv2) ... ok testGatherv2 (test_cco_vec.TestCCOVecWorld.testGatherv2) ... ok testGatherv2 (test_cco_vec.TestCCOVecWorld.testGatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_vec.TestCCOVecWorld.testGatherv3) ... ok testGatherv3 (test_cco_vec.TestCCOVecWorld.testGatherv3) ... ok testGatherv3 (test_cco_vec.TestCCOVecWorld.testGatherv3) ... ok testGatherv3 (test_cco_vec.TestCCOVecWorld.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_vec.TestCCOVecWorld.testScatterv) ... ok testScatterv (test_cco_vec.TestCCOVecWorld.testScatterv) ... ok testScatterv (test_cco_vec.TestCCOVecWorld.testScatterv) ... ok testScatterv (test_cco_vec.TestCCOVecWorld.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_vec.TestCCOVecWorld.testScatterv2) ... ok testScatterv2 (test_cco_vec.TestCCOVecWorld.testScatterv2) ... ok testScatterv2 (test_cco_vec.TestCCOVecWorld.testScatterv2) ... 
ok testScatterv2 (test_cco_vec.TestCCOVecWorld.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_vec.TestCCOVecWorld.testScatterv3) ... ok testScatterv3 (test_cco_vec.TestCCOVecWorld.testScatterv3) ... ok testScatterv3 (test_cco_vec.TestCCOVecWorld.testScatterv3) ... ok testScatterv3 (test_cco_vec.TestCCOVecWorld.testScatterv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv (test_cco_vec.TestCCOVecWorldDup.testAllgatherv) ... ok testAllgatherv (test_cco_vec.TestCCOVecWorldDup.testAllgatherv) ... ok testAllgatherv (test_cco_vec.TestCCOVecWorldDup.testAllgatherv) ... ok testAllgatherv (test_cco_vec.TestCCOVecWorldDup.testAllgatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv2 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok testAllgatherv2 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok testAllgatherv2 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv2) ... ok testAllgatherv2 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherv3 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv3) ... ok testAllgatherv3 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv3) ... 
ok testAllgatherv3 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv3) ... ok testAllgatherv3 (test_cco_vec.TestCCOVecWorldDup.testAllgatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv (test_cco_vec.TestCCOVecWorldDup.testAlltoallv) ... ok testAlltoallv (test_cco_vec.TestCCOVecWorldDup.testAlltoallv) ... ok testAlltoallv (test_cco_vec.TestCCOVecWorldDup.testAlltoallv) ... ok testAlltoallv (test_cco_vec.TestCCOVecWorldDup.testAlltoallv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv2 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok testAlltoallv2 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok testAlltoallv2 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv2) ... ok testAlltoallv2 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallv3 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok testAlltoallv3 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok testAlltoallv3 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv3) ... ok testAlltoallv3 (test_cco_vec.TestCCOVecWorldDup.testAlltoallv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallw (test_cco_vec.TestCCOVecWorldDup.testAlltoallw) ... 
ok testAlltoallw (test_cco_vec.TestCCOVecWorldDup.testAlltoallw) ... ok testAlltoallw (test_cco_vec.TestCCOVecWorldDup.testAlltoallw) ... ok testAlltoallw (test_cco_vec.TestCCOVecWorldDup.testAlltoallw) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... ok testAlltoallwBottom (test_cco_vec.TestCCOVecWorldDup.testAlltoallwBottom) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv (test_cco_vec.TestCCOVecWorldDup.testGatherv) ... ok testGatherv (test_cco_vec.TestCCOVecWorldDup.testGatherv) ... ok testGatherv (test_cco_vec.TestCCOVecWorldDup.testGatherv) ... ok testGatherv (test_cco_vec.TestCCOVecWorldDup.testGatherv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv2 (test_cco_vec.TestCCOVecWorldDup.testGatherv2) ... ok testGatherv2 (test_cco_vec.TestCCOVecWorldDup.testGatherv2) ... ok testGatherv2 (test_cco_vec.TestCCOVecWorldDup.testGatherv2) ... ok testGatherv2 (test_cco_vec.TestCCOVecWorldDup.testGatherv2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGatherv3 (test_cco_vec.TestCCOVecWorldDup.testGatherv3) ... ok testGatherv3 (test_cco_vec.TestCCOVecWorldDup.testGatherv3) ... ok testGatherv3 (test_cco_vec.TestCCOVecWorldDup.testGatherv3) ... ok testGatherv3 (test_cco_vec.TestCCOVecWorldDup.testGatherv3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv (test_cco_vec.TestCCOVecWorldDup.testScatterv) ... ok testScatterv (test_cco_vec.TestCCOVecWorldDup.testScatterv) ... ok testScatterv (test_cco_vec.TestCCOVecWorldDup.testScatterv) ... ok testScatterv (test_cco_vec.TestCCOVecWorldDup.testScatterv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv2 (test_cco_vec.TestCCOVecWorldDup.testScatterv2) ... ok testScatterv2 (test_cco_vec.TestCCOVecWorldDup.testScatterv2) ... ok testScatterv2 (test_cco_vec.TestCCOVecWorldDup.testScatterv2) ... ok testScatterv2 (test_cco_vec.TestCCOVecWorldDup.testScatterv2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testScatterv3 (test_cco_vec.TestCCOVecWorldDup.testScatterv3) ... ok testScatterv3 (test_cco_vec.TestCCOVecWorldDup.testScatterv3) ... ok testScatterv3 (test_cco_vec.TestCCOVecWorldDup.testScatterv3) ... ok testScatterv3 (test_cco_vec.TestCCOVecWorldDup.testScatterv3) ... 
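The vector collectives exercised above (Gatherv, Scatterv, Allgatherv, Alltoallv) differ from their plain counterparts in that each rank may contribute or receive a different number of elements, described by per-rank counts and displacement arrays. As a minimal sketch of that bookkeeping (pure Python, no MPI; `counts_displs` is an illustrative helper name, not part of mpi4py's API):

```python
def counts_displs(total, nprocs):
    """Split `total` items over `nprocs` ranks as evenly as possible,
    returning the per-rank counts and the offset (displacement) of each
    rank's block -- the two arrays that Gatherv/Scatterv-style calls take."""
    base, extra = divmod(total, nprocs)
    # The first `extra` ranks each take one additional item.
    counts = [base + (1 if r < extra else 0) for r in range(nprocs)]
    displs = [0] * nprocs
    for r in range(1, nprocs):
        displs[r] = displs[r - 1] + counts[r - 1]
    return counts, displs

counts, displs = counts_displs(10, 4)
print(counts)  # [3, 3, 2, 2]
print(displs)  # [0, 3, 6, 8]
```

In mpi4py these arrays are typically passed alongside the buffer, e.g. as a `(sendbuf, counts, displs, datatype)` tuple to the uppercase buffer-based collectives.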
testAllgatherv (test_cco_vec_inter.TestCCOVecInter.testAllgatherv) ... ok
testAlltoallv (test_cco_vec_inter.TestCCOVecInter.testAlltoallv) ... ok
testAlltoallw (test_cco_vec_inter.TestCCOVecInter.testAlltoallw) ... ok
testGatherv (test_cco_vec_inter.TestCCOVecInter.testGatherv) ... ok
testScatterv (test_cco_vec_inter.TestCCOVecInter.testScatterv) ... ok
testAllgatherv (test_cco_vec_inter.TestCCOVecInterDup.testAllgatherv) ... ok
testAlltoallv (test_cco_vec_inter.TestCCOVecInterDup.testAlltoallv) ... ok
testAlltoallw (test_cco_vec_inter.TestCCOVecInterDup.testAlltoallw) ... ok
testGatherv (test_cco_vec_inter.TestCCOVecInterDup.testGatherv) ... ok
testScatterv (test_cco_vec_inter.TestCCOVecInterDup.testScatterv) ... ok
testHandleAddress (test_cffi.TestCFFI.testHandleAddress) ... skipped 'cffi'
testHandleValue (test_cffi.TestCFFI.testHandleValue) ... skipped 'cffi'
testConstructor (test_comm.TestCommNull.testConstructor) ... ok
testConstructorInter (test_comm.TestCommNull.testConstructorInter) ... ok
testConstructorIntra (test_comm.TestCommNull.testConstructorIntra) ... ok
testGetName (test_comm.TestCommNull.testGetName) ... ok
testPickle (test_comm.TestCommNull.testPickle) ... ok
testBuffering (test_comm.TestCommSelf.testBuffering) ... ok
testCloneFree (test_comm.TestCommSelf.testCloneFree) ... ok
testCompare (test_comm.TestCommSelf.testCompare) ... ok
testConstructor (test_comm.TestCommSelf.testConstructor) ... ok
testCreate (test_comm.TestCommSelf.testCreate) ... ok
testCreateFromGroup (test_comm.TestCommSelf.testCreateFromGroup) ... ok
testCreateGroup (test_comm.TestCommSelf.testCreateGroup) ... ok
testDupWithInfo (test_comm.TestCommSelf.testDupWithInfo) ... ok
testGetParent (test_comm.TestCommSelf.testGetParent) ... ok
testGetSetInfo (test_comm.TestCommSelf.testGetSetInfo) ... ok
testGetSetName (test_comm.TestCommSelf.testGetSetName) ... ok
testGroup (test_comm.TestCommSelf.testGroup) ... ok
testIDup (test_comm.TestCommSelf.testIDup) ... ok
testIDupWithInfo (test_comm.TestCommSelf.testIDupWithInfo) ... ok
testIsInter (test_comm.TestCommSelf.testIsInter) ... ok
testPickle (test_comm.TestCommSelf.testPickle) ... ok
testPyProps (test_comm.TestCommSelf.testPyProps) ... ok
testRank (test_comm.TestCommSelf.testRank) ... ok
testSize (test_comm.TestCommSelf.testSize) ... ok
testSplit (test_comm.TestCommSelf.testSplit) ... ok
testSplitTypeHWGuided (test_comm.TestCommSelf.testSplitTypeHWGuided) ... ok
testSplitTypeHWUnguided (test_comm.TestCommSelf.testSplitTypeHWUnguided) ... ok
testSplitTypeResourceGuided (test_comm.TestCommSelf.testSplitTypeResourceGuided) ... ok
testSplitTypeShared (test_comm.TestCommSelf.testSplitTypeShared) ... ok
testBuffering (test_comm.TestCommSelfDup.testBuffering) ... ok
testCloneFree (test_comm.TestCommSelfDup.testCloneFree) ... ok
testCompare (test_comm.TestCommSelfDup.testCompare) ... ok
testConstructor (test_comm.TestCommSelfDup.testConstructor) ... ok
testCreate (test_comm.TestCommSelfDup.testCreate) ... ok
testCreateFromGroup (test_comm.TestCommSelfDup.testCreateFromGroup) ... ok
testCreateGroup (test_comm.TestCommSelfDup.testCreateGroup) ... ok
testDupWithInfo (test_comm.TestCommSelfDup.testDupWithInfo) ... ok
testGetParent (test_comm.TestCommSelfDup.testGetParent) ... ok
testGetSetInfo (test_comm.TestCommSelfDup.testGetSetInfo) ... ok
testGetSetName (test_comm.TestCommSelfDup.testGetSetName) ...
ok testGroup (test_comm.TestCommSelfDup.testGroup) ... ok testIDup (test_comm.TestCommSelfDup.testIDup) ... ok testIDupWithInfo (test_comm.TestCommSelfDup.testIDupWithInfo) ... ok testIsInter (test_comm.TestCommSelfDup.testIsInter) ... ok testPickle (test_comm.TestCommSelfDup.testPickle) ... ok testPyProps (test_comm.TestCommSelfDup.testPyProps) ... ok testRank (test_comm.TestCommSelfDup.testRank) ... ok testSize (test_comm.TestCommSelfDup.testSize) ... ok testConstructor (test_comm.TestCommSelfDup.testConstructor) ... ok testCreate (test_comm.TestCommSelfDup.testCreate) ... ok testCreateFromGroup (test_comm.TestCommSelfDup.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommSelfDup.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommSelfDup.testDupWithInfo) ... ok testGetParent (test_comm.TestCommSelfDup.testGetParent) ... ok testGetSetInfo (test_comm.TestCommSelfDup.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommSelfDup.testGetSetName) ... ok testGroup (test_comm.TestCommSelfDup.testGroup) ... ok testIDup (test_comm.TestCommSelfDup.testIDup) ... ok testIDupWithInfo (test_comm.TestCommSelfDup.testIDupWithInfo) ... ok testIsInter (test_comm.TestCommSelfDup.testIsInter) ... ok testPickle (test_comm.TestCommSelfDup.testPickle) ... ok testPyProps (test_comm.TestCommSelfDup.testPyProps) ... ok testRank (test_comm.TestCommSelfDup.testRank) ... ok testSize (test_comm.TestCommSelfDup.testSize) ... ok testCompare (test_comm.TestCommSelfDup.testCompare) ... ok testConstructor (test_comm.TestCommSelfDup.testConstructor) ... ok testCreate (test_comm.TestCommSelfDup.testCreate) ... ok testCreateFromGroup (test_comm.TestCommSelfDup.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommSelfDup.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommSelfDup.testDupWithInfo) ... ok testGetParent (test_comm.TestCommSelfDup.testGetParent) ... ok testGetSetInfo (test_comm.TestCommSelfDup.testGetSetInfo) ... 
ok testGetSetName (test_comm.TestCommSelfDup.testGetSetName) ... ok testGroup (test_comm.TestCommSelfDup.testGroup) ... ok testIDup (test_comm.TestCommSelfDup.testIDup) ... ok testIDupWithInfo (test_comm.TestCommSelfDup.testIDupWithInfo) ... ok testIsInter (test_comm.TestCommSelfDup.testIsInter) ... ok testPickle (test_comm.TestCommSelfDup.testPickle) ... ok testPyProps (test_comm.TestCommSelfDup.testPyProps) ... ok testRank (test_comm.TestCommSelfDup.testRank) ... ok ok testSize (test_comm.TestCommSelfDup.testSize) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetSetName (test_comm.TestCommSelfDup.testGetSetName) ... ok testGroup (test_comm.TestCommSelfDup.testGroup) ... ok testIDup (test_comm.TestCommSelfDup.testIDup) ... ok testIDupWithInfo (test_comm.TestCommSelfDup.testIDupWithInfo) ... ok testIsInter (test_comm.TestCommSelfDup.testIsInter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSplit (test_comm.TestCommSelfDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommSelfDup.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommSelfDup.testSplitTypeHWUnguided) ... ok testSplitTypeResourceGuided (test_comm.TestCommSelfDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommSelfDup.testSplitTypeShared) ... ok testBuffering (test_comm.TestCommWorld.testBuffering) ... testSplit (test_comm.TestCommSelfDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommSelfDup.testSplitTypeHWGuided) ... 
ok testSplitTypeHWUnguided (test_comm.TestCommSelfDup.testSplitTypeHWUnguided) ... ok testSplitTypeResourceGuided (test_comm.TestCommSelfDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommSelfDup.testSplitTypeShared) ... ok testBuffering (test_comm.TestCommWorld.testBuffering) ... ok testSplit (test_comm.TestCommSelfDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommSelfDup.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommSelfDup.testSplitTypeHWUnguided) ... ok testSplitTypeResourceGuided (test_comm.TestCommSelfDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommSelfDup.testSplitTypeShared) ... ok testBuffering (test_comm.TestCommWorld.testBuffering) ... ok testPickle (test_comm.TestCommSelfDup.testPickle) ... ok testPyProps (test_comm.TestCommSelfDup.testPyProps) ... ok testRank (test_comm.TestCommSelfDup.testRank) ... ok testSize (test_comm.TestCommSelfDup.testSize) ... ok testSplit (test_comm.TestCommSelfDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommSelfDup.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommSelfDup.testSplitTypeHWUnguided) ... ok testSplitTypeResourceGuided (test_comm.TestCommSelfDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommSelfDup.testSplitTypeShared) ... ok testBuffering (test_comm.TestCommWorld.testBuffering) ... ok testCloneFree (test_comm.TestCommWorld.testCloneFree) ... ok testCompare (test_comm.TestCommWorld.testCompare) ... ok testConstructor (test_comm.TestCommWorld.testConstructor) ... ok ok testCloneFree (test_comm.TestCommWorld.testCloneFree) ... ok testCompare (test_comm.TestCommWorld.testCompare) ... ok testConstructor (test_comm.TestCommWorld.testConstructor) ... ok ok testCloneFree (test_comm.TestCommWorld.testCloneFree) ... ok testCompare (test_comm.TestCommWorld.testCompare) ... ok testConstructor (test_comm.TestCommWorld.testConstructor) ... 
ok ok testCloneFree (test_comm.TestCommWorld.testCloneFree) ... ok testCompare (test_comm.TestCommWorld.testCompare) ... ok testConstructor (test_comm.TestCommWorld.testConstructor) ... ok testCreate (test_comm.TestCommWorld.testCreate) ... testCreate (test_comm.TestCommWorld.testCreate) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCreate (test_comm.TestCommWorld.testCreate) ... ok testCreateFromGroup (test_comm.TestCommWorld.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorld.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommWorld.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorld.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorld.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorld.testGetSetName) ... ok testGroup (test_comm.TestCommWorld.testGroup) ... ok testIDup (test_comm.TestCommWorld.testIDup) ... ok testIDupWithInfo (test_comm.TestCommWorld.testIDupWithInfo) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCreateFromGroup (test_comm.TestCommWorld.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorld.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommWorld.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorld.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorld.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorld.testGetSetName) ... 
ok testGroup (test_comm.TestCommWorld.testGroup) ... ok testIDup (test_comm.TestCommWorld.testIDup) ... ok testIDupWithInfo (test_comm.TestCommWorld.testIDupWithInfo) ... ok testCreate (test_comm.TestCommWorld.testCreate) ... ok testCreateFromGroup (test_comm.TestCommWorld.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorld.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommWorld.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorld.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorld.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorld.testGetSetName) ... ok testGroup (test_comm.TestCommWorld.testGroup) ... ok testIDup (test_comm.TestCommWorld.testIDup) ... ok testIDupWithInfo (test_comm.TestCommWorld.testIDupWithInfo) ... ok testCreateFromGroup (test_comm.TestCommWorld.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorld.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommWorld.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorld.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorld.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorld.testGetSetName) ... ok testGroup (test_comm.TestCommWorld.testGroup) ... ok testIDup (test_comm.TestCommWorld.testIDup) ... ok testIDupWithInfo (test_comm.TestCommWorld.testIDupWithInfo) ... ok ok testIsInter (test_comm.TestCommWorld.testIsInter) ... testIsInter (test_comm.TestCommWorld.testIsInter) ... testIsInter (test_comm.TestCommWorld.testIsInter) ... ok testPickle (test_comm.TestCommWorld.testPickle) ... ok testPyProps (test_comm.TestCommWorld.testPyProps) ... ok testRank (test_comm.TestCommWorld.testRank) ... ok testSize (test_comm.TestCommWorld.testSize) ... ok testSplit (test_comm.TestCommWorld.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorld.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorld.testSplitTypeHWUnguided) ... ok testPickle (test_comm.TestCommWorld.testPickle) ... 
ok testPyProps (test_comm.TestCommWorld.testPyProps) ... ok testRank (test_comm.TestCommWorld.testRank) ... ok testSize (test_comm.TestCommWorld.testSize) ... ok testSplit (test_comm.TestCommWorld.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorld.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorld.testSplitTypeHWUnguided) ... ok testIsInter (test_comm.TestCommWorld.testIsInter) ... ok testPickle (test_comm.TestCommWorld.testPickle) ... ok testPyProps (test_comm.TestCommWorld.testPyProps) ... ok testRank (test_comm.TestCommWorld.testRank) ... ok testSize (test_comm.TestCommWorld.testSize) ... ok testSplit (test_comm.TestCommWorld.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorld.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorld.testSplitTypeHWUnguided) ... ok testPickle (test_comm.TestCommWorld.testPickle) ... ok testPyProps (test_comm.TestCommWorld.testPyProps) ... ok testRank (test_comm.TestCommWorld.testRank) ... ok testSize (test_comm.TestCommWorld.testSize) ... ok testSplit (test_comm.TestCommWorld.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorld.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorld.testSplitTypeHWUnguided) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSplitTypeResourceGuided (test_comm.TestCommWorld.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorld.testSplitTypeShared) ... ok testBuffering (test_comm.TestCommWorldDup.testBuffering) ... ok testSplitTypeResourceGuided (test_comm.TestCommWorld.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorld.testSplitTypeShared) ... 
ok ok testSplitTypeResourceGuided (test_comm.TestCommWorld.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorld.testSplitTypeShared) ... ok testBuffering (test_comm.TestCommWorldDup.testBuffering) ... ok testSplitTypeResourceGuided (test_comm.TestCommWorld.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorld.testSplitTypeShared) ... ok testBuffering (test_comm.TestCommWorldDup.testBuffering) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCloneFree (test_comm.TestCommWorldDup.testCloneFree) ... ok testCompare (test_comm.TestCommWorldDup.testCompare) ... ok testConstructor (test_comm.TestCommWorldDup.testConstructor) ... ok testCreate (test_comm.TestCommWorldDup.testCreate) ... ok testCreateFromGroup (test_comm.TestCommWorldDup.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorldDup.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommWorldDup.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorldDup.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorldDup.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorldDup.testGetSetName) ... ok testGroup (test_comm.TestCommWorldDup.testGroup) ... ok testIDup (test_comm.TestCommWorldDup.testIDup) ... ok testBuffering (test_comm.TestCommWorldDup.testBuffering) ... ok testCloneFree (test_comm.TestCommWorldDup.testCloneFree) ... ok testCompare (test_comm.TestCommWorldDup.testCompare) ... ok testConstructor (test_comm.TestCommWorldDup.testConstructor) ... ok testCreate (test_comm.TestCommWorldDup.testCreate) ... ok testCreateFromGroup (test_comm.TestCommWorldDup.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorldDup.testCreateGroup) ... 
ok testDupWithInfo (test_comm.TestCommWorldDup.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorldDup.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorldDup.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorldDup.testGetSetName) ... ok testGroup (test_comm.TestCommWorldDup.testGroup) ... ok testIDup (test_comm.TestCommWorldDup.testIDup) ... ok ok testCloneFree (test_comm.TestCommWorldDup.testCloneFree) ... ok testCompare (test_comm.TestCommWorldDup.testCompare) ... ok testConstructor (test_comm.TestCommWorldDup.testConstructor) ... ok testCreate (test_comm.TestCommWorldDup.testCreate) ... ok testCreateFromGroup (test_comm.TestCommWorldDup.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorldDup.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommWorldDup.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorldDup.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorldDup.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorldDup.testGetSetName) ... ok testGroup (test_comm.TestCommWorldDup.testGroup) ... ok testIDup (test_comm.TestCommWorldDup.testIDup) ... ok testIDupWithInfo (test_comm.TestCommWorldDup.testIDupWithInfo) ... ok testCloneFree (test_comm.TestCommWorldDup.testCloneFree) ... ok testCompare (test_comm.TestCommWorldDup.testCompare) ... ok testConstructor (test_comm.TestCommWorldDup.testConstructor) ... ok testCreate (test_comm.TestCommWorldDup.testCreate) ... ok testCreateFromGroup (test_comm.TestCommWorldDup.testCreateFromGroup) ... ok testCreateGroup (test_comm.TestCommWorldDup.testCreateGroup) ... ok testDupWithInfo (test_comm.TestCommWorldDup.testDupWithInfo) ... ok testGetParent (test_comm.TestCommWorldDup.testGetParent) ... ok testGetSetInfo (test_comm.TestCommWorldDup.testGetSetInfo) ... ok testGetSetName (test_comm.TestCommWorldDup.testGetSetName) ... ok testGroup (test_comm.TestCommWorldDup.testGroup) ... ok testIDup (test_comm.TestCommWorldDup.testIDup) ... 
ok testIDupWithInfo (test_comm.TestCommWorldDup.testIDupWithInfo) ... testIDupWithInfo (test_comm.TestCommWorldDup.testIDupWithInfo) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIsInter (test_comm.TestCommWorldDup.testIsInter) ... ok testPickle (test_comm.TestCommWorldDup.testPickle) ... ok testPyProps (test_comm.TestCommWorldDup.testPyProps) ... ok testRank (test_comm.TestCommWorldDup.testRank) ... ok testSize (test_comm.TestCommWorldDup.testSize) ... ok testSplit (test_comm.TestCommWorldDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorldDup.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorldDup.testSplitTypeHWUnguided) ... testIDupWithInfo (test_comm.TestCommWorldDup.testIDupWithInfo) ... ok testIsInter (test_comm.TestCommWorldDup.testIsInter) ... ok testPickle (test_comm.TestCommWorldDup.testPickle) ... ok testPyProps (test_comm.TestCommWorldDup.testPyProps) ... ok testRank (test_comm.TestCommWorldDup.testRank) ... ok testSize (test_comm.TestCommWorldDup.testSize) ... ok testSplit (test_comm.TestCommWorldDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorldDup.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorldDup.testSplitTypeHWUnguided) ... ok testIsInter (test_comm.TestCommWorldDup.testIsInter) ... ok testPickle (test_comm.TestCommWorldDup.testPickle) ... ok testPyProps (test_comm.TestCommWorldDup.testPyProps) ... ok testRank (test_comm.TestCommWorldDup.testRank) ... ok testSize (test_comm.TestCommWorldDup.testSize) ... ok testSplit (test_comm.TestCommWorldDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorldDup.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorldDup.testSplitTypeHWUnguided) ... 
ok testIsInter (test_comm.TestCommWorldDup.testIsInter) ... ok testPickle (test_comm.TestCommWorldDup.testPickle) ... ok testPyProps (test_comm.TestCommWorldDup.testPyProps) ... ok testRank (test_comm.TestCommWorldDup.testRank) ... ok testSize (test_comm.TestCommWorldDup.testSize) ... ok testSplit (test_comm.TestCommWorldDup.testSplit) ... ok testSplitTypeHWGuided (test_comm.TestCommWorldDup.testSplitTypeHWGuided) ... ok testSplitTypeHWUnguided (test_comm.TestCommWorldDup.testSplitTypeHWUnguided) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSplitTypeResourceGuided (test_comm.TestCommWorldDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorldDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercomm.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercomm.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercomm.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercomm.testLocalGroupSizeRank) ... ok testMerge (test_comm_inter.TestIntercomm.testMerge) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSplitTypeResourceGuided (test_comm.TestCommWorldDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorldDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercomm.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercomm.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercomm.testFortran) ... 
ok testLocalGroupSizeRank (test_comm_inter.TestIntercomm.testLocalGroupSizeRank) ... ok testMerge (test_comm_inter.TestIntercomm.testMerge) ... ok testSplitTypeResourceGuided (test_comm.TestCommWorldDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorldDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercomm.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercomm.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercomm.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercomm.testLocalGroupSizeRank) ... ok testMerge (test_comm_inter.TestIntercomm.testMerge) ... ok testSplitTypeResourceGuided (test_comm.TestCommWorldDup.testSplitTypeResourceGuided) ... ok testSplitTypeShared (test_comm.TestCommWorldDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercomm.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercomm.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercomm.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercomm.testLocalGroupSizeRank) ... ok testMerge (test_comm_inter.TestIntercomm.testMerge) ... ok testPyProps (test_comm_inter.TestIntercomm.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercomm.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercomm.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercomm.testSplitTypeShared) ... ok testHalf (test_comm_inter.TestIntercommCreateFromGroups.testHalf) ... ok testPair (test_comm_inter.TestIntercommCreateFromGroups.testPair) ... ok testConstructor (test_comm_inter.TestIntercommDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDup.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercommDup.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDup.testLocalGroupSizeRank) ... ok testPyProps (test_comm_inter.TestIntercomm.testPyProps) ... 
ok testRemoteGroupSize (test_comm_inter.TestIntercomm.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercomm.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercomm.testSplitTypeShared) ... ok testHalf (test_comm_inter.TestIntercommCreateFromGroups.testHalf) ... ok testPair (test_comm_inter.TestIntercommCreateFromGroups.testPair) ... ok testConstructor (test_comm_inter.TestIntercommDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDup.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercommDup.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDup.testLocalGroupSizeRank) ... ok testPyProps (test_comm_inter.TestIntercomm.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercomm.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercomm.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercomm.testSplitTypeShared) ... ok testHalf (test_comm_inter.TestIntercommCreateFromGroups.testHalf) ... ok testPair (test_comm_inter.TestIntercommCreateFromGroups.testPair) ... ok testConstructor (test_comm_inter.TestIntercommDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDup.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercommDup.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDup.testLocalGroupSizeRank) ... ok testPyProps (test_comm_inter.TestIntercomm.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercomm.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercomm.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercomm.testSplitTypeShared) ... ok testHalf (test_comm_inter.TestIntercommCreateFromGroups.testHalf) ... ok testPair (test_comm_inter.TestIntercommCreateFromGroups.testPair) ... ok testConstructor (test_comm_inter.TestIntercommDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDup.testCreateFromGroups) ... 
ok testFortran (test_comm_inter.TestIntercommDup.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDup.testLocalGroupSizeRank) ... ok ok testMerge (test_comm_inter.TestIntercommDup.testMerge) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testMerge (test_comm_inter.TestIntercommDup.testMerge) ... ok testPyProps (test_comm_inter.TestIntercommDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercommDupDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDupDup.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercommDupDup.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDupDup.testLocalGroupSizeRank) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMerge (test_comm_inter.TestIntercommDup.testMerge) ... ok testPyProps (test_comm_inter.TestIntercommDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercommDupDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDupDup.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercommDupDup.testFortran) ... 
ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDupDup.testLocalGroupSizeRank) ... ok testMerge (test_comm_inter.TestIntercommDup.testMerge) ... ok testPyProps (test_comm_inter.TestIntercommDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercommDupDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDupDup.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercommDupDup.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDupDup.testLocalGroupSizeRank) ... ok testPyProps (test_comm_inter.TestIntercommDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDup.testSplitTypeShared) ... ok testConstructor (test_comm_inter.TestIntercommDupDup.testConstructor) ... ok testCreateFromGroups (test_comm_inter.TestIntercommDupDup.testCreateFromGroups) ... ok testFortran (test_comm_inter.TestIntercommDupDup.testFortran) ... ok testLocalGroupSizeRank (test_comm_inter.TestIntercommDupDup.testLocalGroupSizeRank) ... ok testMerge (test_comm_inter.TestIntercommDupDup.testMerge) ... ok testPyProps (test_comm_inter.TestIntercommDupDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDupDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDupDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDupDup.testSplitTypeShared) ... ok testConstructorCartcomm (test_comm_topo.TestTopoConstructor.testConstructorCartcomm) ... ok testConstructorDistGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorDistGraphcomm) ... 
ok testConstructorGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorGraphcomm) ... ok testConstructorTopocomm (test_comm_topo.TestTopoConstructor.testConstructorTopocomm) ... ok testCartMap (test_comm_topo.TestTopoSelf.testCartMap) ... ok testMerge (test_comm_inter.TestIntercommDupDup.testMerge) ... ok testPyProps (test_comm_inter.TestIntercommDupDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDupDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDupDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDupDup.testSplitTypeShared) ... ok testConstructorCartcomm (test_comm_topo.TestTopoConstructor.testConstructorCartcomm) ... ok testConstructorDistGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorDistGraphcomm) ... ok testConstructorGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorGraphcomm) ... ok testConstructorTopocomm (test_comm_topo.TestTopoConstructor.testConstructorTopocomm) ... ok testCartMap (test_comm_topo.TestTopoSelf.testCartMap) ... ok ok testMerge (test_comm_inter.TestIntercommDupDup.testMerge) ... ok testPyProps (test_comm_inter.TestIntercommDupDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDupDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDupDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDupDup.testSplitTypeShared) ... ok testConstructorCartcomm (test_comm_topo.TestTopoConstructor.testConstructorCartcomm) ... ok testConstructorDistGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorDistGraphcomm) ... ok testConstructorGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorGraphcomm) ... ok testConstructorTopocomm (test_comm_topo.TestTopoConstructor.testConstructorTopocomm) ... ok testCartMap (test_comm_topo.TestTopoSelf.testCartMap) ... ok testMerge (test_comm_inter.TestIntercommDupDup.testMerge) ... 
ok testPyProps (test_comm_inter.TestIntercommDupDup.testPyProps) ... ok testRemoteGroupSize (test_comm_inter.TestIntercommDupDup.testRemoteGroupSize) ... ok testSplit (test_comm_inter.TestIntercommDupDup.testSplit) ... ok testSplitTypeShared (test_comm_inter.TestIntercommDupDup.testSplitTypeShared) ... ok testConstructorCartcomm (test_comm_topo.TestTopoConstructor.testConstructorCartcomm) ... ok testConstructorDistGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorDistGraphcomm) ... ok testConstructorGraphcomm (test_comm_topo.TestTopoConstructor.testConstructorGraphcomm) ... ok testConstructorTopocomm (test_comm_topo.TestTopoConstructor.testConstructorTopocomm) ... ok testCartMap (test_comm_topo.TestTopoSelf.testCartMap) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCartcomm (test_comm_topo.TestTopoSelf.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelf.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelf.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelf.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelf.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelf.testGraphcomm) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCartcomm (test_comm_topo.TestTopoSelf.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelf.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelf.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelf.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelf.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelf.testGraphcomm) ... 
ok testCartMap (test_comm_topo.TestTopoSelfDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoSelf.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelf.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelf.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelf.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelf.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelf.testGraphcomm) ... ok testCartMap (test_comm_topo.TestTopoSelfDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoSelf.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelf.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelf.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelf.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelf.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelf.testGraphcomm) ... ok ok testCartMap (test_comm_topo.TestTopoSelfDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoSelfDup.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelfDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelfDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelfDup.testDistgraphcommAdjacent) ... ok ok testCartcomm (test_comm_topo.TestTopoSelfDup.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelfDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelfDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelfDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelfDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelfDup.testGraphcomm) ... 
ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCartcomm (test_comm_topo.TestTopoSelfDup.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelfDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelfDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelfDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelfDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelfDup.testGraphcomm) ... ok testCartMap (test_comm_topo.TestTopoWorld.testCartMap) ... testCartMap (test_comm_topo.TestTopoSelfDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoSelfDup.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoSelfDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoSelfDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoSelfDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoSelfDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelfDup.testGraphcomm) ... testCartMap (test_comm_topo.TestTopoWorld.testCartMap) ... testGraphMap (test_comm_topo.TestTopoSelfDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoSelfDup.testGraphcomm) ... ok testCartMap (test_comm_topo.TestTopoWorld.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorld.testCartcomm) ... ok testCartcomm (test_comm_topo.TestTopoWorld.testCartcomm) ... ok testCartcomm (test_comm_topo.TestTopoWorld.testCartcomm) ... ok testCartMap (test_comm_topo.TestTopoWorld.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorld.testCartcomm) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCartcommZeroDim (test_comm_topo.TestTopoWorld.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorld.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorld.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorld.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorld.testGraphcomm) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCartcommZeroDim (test_comm_topo.TestTopoWorld.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorld.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorld.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorld.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorld.testGraphcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoWorld.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorld.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorld.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorld.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorld.testGraphcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoWorld.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorld.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorld.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorld.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorld.testGraphcomm) ... 
ok testCartMap (test_comm_topo.TestTopoWorldDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorldDup.testCartcomm) ... ok ok testCartMap (test_comm_topo.TestTopoWorldDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorldDup.testCartcomm) ... ok ok testCartMap (test_comm_topo.TestTopoWorldDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorldDup.testCartcomm) ... ok ok testCartMap (test_comm_topo.TestTopoWorldDup.testCartMap) ... ok testCartcomm (test_comm_topo.TestTopoWorldDup.testCartcomm) ... ok testCartcommZeroDim (test_comm_topo.TestTopoWorldDup.testCartcommZeroDim) ... testCartcommZeroDim (test_comm_topo.TestTopoWorldDup.testCartcommZeroDim) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDistgraphcomm (test_comm_topo.TestTopoWorldDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorldDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorldDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorldDup.testGraphcomm) ... ok testHandleAdress (test_ctypes.TestCTYPES.testHandleAdress) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCartcommZeroDim (test_comm_topo.TestTopoWorldDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorldDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorldDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorldDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorldDup.testGraphcomm) ... ok testHandleAdress (test_ctypes.TestCTYPES.testHandleAdress) ... 
testCartcommZeroDim (test_comm_topo.TestTopoWorldDup.testCartcommZeroDim) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorldDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorldDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorldDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorldDup.testGraphcomm) ... ok testHandleAdress (test_ctypes.TestCTYPES.testHandleAdress) ... ok testDistgraphcomm (test_comm_topo.TestTopoWorldDup.testDistgraphcomm) ... ok testDistgraphcommAdjacent (test_comm_topo.TestTopoWorldDup.testDistgraphcommAdjacent) ... ok testGraphMap (test_comm_topo.TestTopoWorldDup.testGraphMap) ... ok testGraphcomm (test_comm_topo.TestTopoWorldDup.testGraphcomm) ... ok testHandleAdress (test_ctypes.TestCTYPES.testHandleAdress) ... ok testHandleValue (test_ctypes.TestCTYPES.testHandleValue) ... ok testBoolEqNe (test_datatype.TestDatatype.testBoolEqNe) ... ok testCodeCharStr (test_datatype.TestDatatype.testCodeCharStr) ... ok testHandleValue (test_ctypes.TestCTYPES.testHandleValue) ... ok testBoolEqNe (test_datatype.TestDatatype.testBoolEqNe) ... ok testCodeCharStr (test_datatype.TestDatatype.testCodeCharStr) ... ok testHandleValue (test_ctypes.TestCTYPES.testHandleValue) ... ok testBoolEqNe (test_datatype.TestDatatype.testBoolEqNe) ... ok testCodeCharStr (test_datatype.TestDatatype.testCodeCharStr) ... ok testHandleValue (test_ctypes.TestCTYPES.testHandleValue) ... ok testBoolEqNe (test_datatype.TestDatatype.testBoolEqNe) ... ok testCodeCharStr (test_datatype.TestDatatype.testCodeCharStr) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommit (test_datatype.TestDatatype.testCommit) ... ok testGetEnvelope (test_datatype.TestDatatype.testGetEnvelope) ... 
ok testGetExtent (test_datatype.TestDatatype.testGetExtent) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommit (test_datatype.TestDatatype.testCommit) ... ok testGetEnvelope (test_datatype.TestDatatype.testGetEnvelope) ... ok testCommit (test_datatype.TestDatatype.testCommit) ... ok testGetEnvelope (test_datatype.TestDatatype.testGetEnvelope) ... ok ok testCommit (test_datatype.TestDatatype.testCommit) ... ok testGetEnvelope (test_datatype.TestDatatype.testGetEnvelope) ... ok testGetSetName (test_datatype.TestDatatype.testGetSetName) ... ok testGetSize (test_datatype.TestDatatype.testGetSize) ... ok testGetTrueExtent (test_datatype.TestDatatype.testGetTrueExtent) ... ok testGetValueIndex (test_datatype.TestDatatype.testGetValueIndex) ... ok testMatchSize (test_datatype.TestDatatype.testMatchSize) ... ok testContiguous (test_datatype.TestDatatypeCreate.testContiguous) ... ok testGetExtent (test_datatype.TestDatatype.testGetExtent) ... ok testGetSetName (test_datatype.TestDatatype.testGetSetName) ... ok testGetSize (test_datatype.TestDatatype.testGetSize) ... ok testGetTrueExtent (test_datatype.TestDatatype.testGetTrueExtent) ... ok testGetValueIndex (test_datatype.TestDatatype.testGetValueIndex) ... ok testMatchSize (test_datatype.TestDatatype.testMatchSize) ... ok testContiguous (test_datatype.TestDatatypeCreate.testContiguous) ... testGetExtent (test_datatype.TestDatatype.testGetExtent) ... ok testGetSetName (test_datatype.TestDatatype.testGetSetName) ... ok testGetSize (test_datatype.TestDatatype.testGetSize) ... ok testGetTrueExtent (test_datatype.TestDatatype.testGetTrueExtent) ... ok testGetValueIndex (test_datatype.TestDatatype.testGetValueIndex) ... ok testMatchSize (test_datatype.TestDatatype.testMatchSize) ... 
ok testContiguous (test_datatype.TestDatatypeCreate.testContiguous) ... ok testGetExtent (test_datatype.TestDatatype.testGetExtent) ... ok testGetSetName (test_datatype.TestDatatype.testGetSetName) ... ok testGetSize (test_datatype.TestDatatype.testGetSize) ... ok testGetTrueExtent (test_datatype.TestDatatype.testGetTrueExtent) ... ok testGetValueIndex (test_datatype.TestDatatype.testGetValueIndex) ... ok testMatchSize (test_datatype.TestDatatype.testMatchSize) ... ok testContiguous (test_datatype.TestDatatypeCreate.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarray (test_datatype.TestDatatypeCreate.testDarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarray (test_datatype.TestDatatypeCreate.testDarray) ... ok testDarray (test_datatype.TestDatatypeCreate.testDarray) ... ok testDarray (test_datatype.TestDatatypeCreate.testDarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_datatype.TestDatatypeCreate.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90ComplexDouble (test_datatype.TestDatatypeCreate.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypeCreate.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypeCreate.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypeCreate.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypeCreate.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypeCreate.testHindexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_datatype.TestDatatypeCreate.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_datatype.TestDatatypeCreate.testDup) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90ComplexDouble (test_datatype.TestDatatypeCreate.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypeCreate.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypeCreate.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypeCreate.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypeCreate.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypeCreate.testHindexed) ... ok testF90ComplexDouble (test_datatype.TestDatatypeCreate.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypeCreate.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypeCreate.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypeCreate.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypeCreate.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypeCreate.testHindexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_datatype.TestDatatypeCreate.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90ComplexDouble (test_datatype.TestDatatypeCreate.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypeCreate.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypeCreate.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypeCreate.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypeCreate.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypeCreate.testHindexed) ... ok testHindexedBlock (test_datatype.TestDatatypeCreate.testHindexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHindexedBlock (test_datatype.TestDatatypeCreate.testHindexedBlock) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHindexedBlock (test_datatype.TestDatatypeCreate.testHindexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHvector (test_datatype.TestDatatypeCreate.testHvector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHindexedBlock (test_datatype.TestDatatypeCreate.testHindexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHvector (test_datatype.TestDatatypeCreate.testHvector) ... ok testHvector (test_datatype.TestDatatypeCreate.testHvector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHvector (test_datatype.TestDatatypeCreate.testHvector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypeCreate.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypeCreate.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypeCreate.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_datatype.TestDatatypeCreate.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_datatype.TestDatatypeCreate.testIndexedBlock) ... ok testIndexedBlock (test_datatype.TestDatatypeCreate.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypeCreate.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypeCreate.testResized) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypeCreate.testResized) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypeCreate.testResized) ... ok testIndexedBlock (test_datatype.TestDatatypeCreate.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypeCreate.testResized) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypeCreate.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypeCreate.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypeCreate.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypeCreate.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_datatype.TestDatatypeCreate.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_datatype.TestDatatypeCreate.testSubarray) ... ok testSubarray (test_datatype.TestDatatypeCreate.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_datatype.TestDatatypeCreate.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypeCreate.testValueIndex) ... ok testVector (test_datatype.TestDatatypeCreate.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypeCreate.testValueIndex) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypeCreate.testValueIndex) ... ok testVector (test_datatype.TestDatatypeCreate.testVector) ... ok testVector (test_datatype.TestDatatypeCreate.testVector) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypeCreate.testValueIndex) ... ok testVector (test_datatype.TestDatatypeCreate.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testConstructor (test_datatype.TestDatatypeNull.testConstructor) ... ok testGetName (test_datatype.TestDatatypeNull.testGetName) ... ok testContiguous (test_datatype.TestDatatypePickle.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testConstructor (test_datatype.TestDatatypeNull.testConstructor) ... ok testGetName (test_datatype.TestDatatypeNull.testGetName) ... ok testContiguous (test_datatype.TestDatatypePickle.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testConstructor (test_datatype.TestDatatypeNull.testConstructor) ... ok testGetName (test_datatype.TestDatatypeNull.testGetName) ... ok testContiguous (test_datatype.TestDatatypePickle.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarray (test_datatype.TestDatatypePickle.testDarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testConstructor (test_datatype.TestDatatypeNull.testConstructor) ... ok testGetName (test_datatype.TestDatatypeNull.testGetName) ... ok testContiguous (test_datatype.TestDatatypePickle.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarray (test_datatype.TestDatatypePickle.testDarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarray (test_datatype.TestDatatypePickle.testDarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarray (test_datatype.TestDatatypePickle.testDarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_datatype.TestDatatypePickle.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90ComplexDouble (test_datatype.TestDatatypePickle.testF90ComplexDouble) ... 
ok testF90ComplexSingle (test_datatype.TestDatatypePickle.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypePickle.testF90Integer) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90RealDouble (test_datatype.TestDatatypePickle.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypePickle.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypePickle.testHindexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_datatype.TestDatatypePickle.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90ComplexDouble (test_datatype.TestDatatypePickle.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypePickle.testF90ComplexSingle) ... ok testF90Integer (test_datatype.TestDatatypePickle.testF90Integer) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90RealDouble (test_datatype.TestDatatypePickle.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypePickle.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypePickle.testHindexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_datatype.TestDatatypePickle.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90ComplexDouble (test_datatype.TestDatatypePickle.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypePickle.testF90ComplexSingle) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHindexedBlock (test_datatype.TestDatatypePickle.testHindexedBlock) ... ok testF90Integer (test_datatype.TestDatatypePickle.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypePickle.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypePickle.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypePickle.testHindexed) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHindexedBlock (test_datatype.TestDatatypePickle.testHindexedBlock) ... ok testDup (test_datatype.TestDatatypePickle.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90ComplexDouble (test_datatype.TestDatatypePickle.testF90ComplexDouble) ... ok testF90ComplexSingle (test_datatype.TestDatatypePickle.testF90ComplexSingle) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90Integer (test_datatype.TestDatatypePickle.testF90Integer) ... ok testF90RealDouble (test_datatype.TestDatatypePickle.testF90RealDouble) ... ok testF90RealSingle (test_datatype.TestDatatypePickle.testF90RealSingle) ... ok testHindexed (test_datatype.TestDatatypePickle.testHindexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHvector (test_datatype.TestDatatypePickle.testHvector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHindexedBlock (test_datatype.TestDatatypePickle.testHindexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHvector (test_datatype.TestDatatypePickle.testHvector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHindexedBlock (test_datatype.TestDatatypePickle.testHindexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHvector (test_datatype.TestDatatypePickle.testHvector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHvector (test_datatype.TestDatatypePickle.testHvector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypePickle.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypePickle.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_datatype.TestDatatypePickle.testIndexedBlock) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypePickle.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_datatype.TestDatatypePickle.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNamed (test_datatype.TestDatatypePickle.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypePickle.testResized) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_datatype.TestDatatypePickle.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_datatype.TestDatatypePickle.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNamed (test_datatype.TestDatatypePickle.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypePickle.testResized) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testNamed (test_datatype.TestDatatypePickle.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypePickle.testResized) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_datatype.TestDatatypePickle.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNamed (test_datatype.TestDatatypePickle.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testResized (test_datatype.TestDatatypePickle.testResized) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypePickle.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypePickle.testStruct) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypePickle.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_datatype.TestDatatypePickle.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_datatype.TestDatatypePickle.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_datatype.TestDatatypePickle.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_datatype.TestDatatypePickle.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_datatype.TestDatatypePickle.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypePickle.testValueIndex) ... ok testVector (test_datatype.TestDatatypePickle.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypePickle.testValueIndex) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_datatype.TestDatatypePickle.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypePickle.testValueIndex) ... ok testVector (test_datatype.TestDatatypePickle.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValueIndex (test_datatype.TestDatatypePickle.testValueIndex) ... ok testVector (test_datatype.TestDatatypePickle.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testDoc (test_doc.TestDoc.testDoc) ... Sending upstream hdr.cmd = CMD_STDERR ok testAcceptConnect (test_dynproc.TestDPM.testAcceptConnect) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDoc (test_doc.TestDoc.testDoc) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcceptConnect (test_dynproc.TestDPM.testAcceptConnect) ... 
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
testAcceptConnect (test_dynproc.TestDPM.testAcceptConnect) ... ok
testConnectAccept (test_dynproc.TestDPM.testConnectAccept) ... ok
testJoin (test_dynproc.TestDPM.testJoin) ... ok
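Each `get_result` above carries a hydra "businesscard" value such as `description#virt32a$port#45101$ifname#127.0.1.1$`: `$`-separated fields, each a `key#value` pair. A minimal sketch of decoding that value (the helper name `parse_businesscard` is ours, not part of hydra):

```python
def parse_businesscard(value: str) -> dict:
    """Decode a hydra PMI businesscard value, e.g.
    'description#virt32a$port#45101$ifname#127.0.1.1$',
    into a {field: value} dict."""
    fields = {}
    # Fields are separated by '$'; each field is 'key#value'.
    for chunk in value.strip("$").split("$"):
        key, _, val = chunk.partition("#")
        fields[key] = val
    return fields

card = parse_businesscard("description#virt32a$port#45101$ifname#127.0.1.1$")
# card["port"] is the TCP port the rank listens on ("45101" above)
```

This only handles values whose fields contain no literal `$` or `#`, which holds for the cards in this log.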
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] cmd=publish_name service=mpi4py-0 port=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=publish_name service=mpi4py-0 port=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=publish_result rc=0
[proxy:0@virt32a] we don't understand the response publish_result; forwarding downstream
[proxy:0@virt32a] cmd=publish_name service=mpi4py-1 port=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$
[proxy:0@virt32a] cmd=publish_name service=mpi4py-2 port=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$
[proxy:0@virt32a] cmd=publish_name service=mpi4py-3 port=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] cmd=lookup_name service=mpi4py-0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=publish_name service=mpi4py-2 port=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=publish_result rc=0
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=publish_name service=mpi4py-3 port=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=publish_result rc=0
[proxy:0@virt32a] we don't understand the response publish_result; forwarding downstream
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] cmd=lookup_name service=mpi4py-2
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=lookup_name service=mpi4py-0
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=lookup_result rc=0 port=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=lookup_name service=mpi4py-2
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=lookup_result rc=0 port=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=lookup_name service=mpi4py-3
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=lookup_result rc=0 port=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$
[proxy:0@virt32a] we don't understand the response lookup_result; forwarding downstream
testNamePublishing (test_dynproc.TestDPM.testNamePublishing) ... ok
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=unpublish_name service=mpi4py-0
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=unpublish_result rc=0
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=unpublish_name service=mpi4py-2
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=unpublish_result rc=0
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=unpublish_name service=mpi4py-3
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=unpublish_result rc=0
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=lookup_name service=mpi4py-1
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=lookup_result rc=0 port=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get_universe_size
[proxy:0@virt32a] Sending PMI command: cmd=universe_size rc=0 size=-1
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get_universe_size
[proxy:0@virt32a] Sending PMI command: cmd=universe_size rc=0 size=-1
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get_universe_size
[proxy:0@virt32a] Sending PMI command: cmd=universe_size rc=0 size=-1
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testGetHWResourceInfo (test_environ.TestEnviron.testGetHWResourceInfo) ... ok
testGetLibraryVersion (test_environ.TestEnviron.testGetLibraryVersion) ... ok
testGetProcessorName (test_environ.TestEnviron.testGetProcessorName) ... ok
testGetVersion (test_environ.TestEnviron.testGetVersion) ... ok
testIsFinalized (test_environ.TestEnviron.testIsFinalized) ... ok
testIsInitialized (test_environ.TestEnviron.testIsInitialized) ... ok
testPControl (test_environ.TestEnviron.testPControl) ... ok
testWTick (test_environ.TestEnviron.testWTick) ... ok
testWTime (test_environ.TestEnviron.testWTime) ... ok
testAppNum (test_environ.TestWorldAttrs.testAppNum) ... ok
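The PMI messages above all share one wire shape: a line of space-separated `key=value` tokens, e.g. `cmd=universe_size rc=0 size=-1`. A minimal sketch of splitting such a line into a dict (the helper name `parse_pmi_command` is ours; it assumes values contain no spaces, which holds for the commands in this log):

```python
def parse_pmi_command(line: str) -> dict:
    """Split a PMI wire-format line such as
    'cmd=universe_size rc=0 size=-1' into {key: value}."""
    out = {}
    for token in line.split():
        key, _, val = token.partition("=")
        out[key] = val
    return out

msg = parse_pmi_command("cmd=universe_size rc=0 size=-1")
# msg["size"] == "-1" here means the process manager reports no
# fixed universe size to MPI_UNIVERSE_SIZE queries
```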
testIOProcessor (test_environ.TestWorldAttrs.testIOProcessor) ... ok
testLastUsedCode (test_environ.TestWorldAttrs.testLastUsedCode) ... ok
testUniverseSize (test_environ.TestWorldAttrs.testUniverseSize) ... ok
testWTimeIsGlobal (test_environ.TestWorldAttrs.testWTimeIsGlobal) ... ok
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] cmd=unpublish_name service=mpi4py-1
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: cmd=unpublish_name service=mpi4py-1
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=unpublish_result rc=0
[proxy:0@virt32a] we don't understand the response unpublish_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get_universe_size
[proxy:0@virt32a] Sending PMI command: cmd=universe_size rc=0 size=-1
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
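The `publish_name` / `lookup_name` / `unpublish_name` exchanges traced above show mpiexec acting as the registry behind `MPI_Publish_name`, `MPI_Lookup_name` and `MPI_Unpublish_name`: the proxy forwards the commands upstream, and mpiexec maps each service name (`mpi4py-0` … `mpi4py-3`) to a port string. A toy in-memory model of that registry, assuming only the semantics visible in this log (the class name `NameService` is ours):

```python
class NameService:
    """Toy model of the service-name registry that mpiexec keeps
    for the publish_name/lookup_name/unpublish_name PMI commands."""

    def __init__(self):
        self._ports = {}  # service name -> published port string

    def publish(self, service: str, port: str) -> int:
        self._ports[service] = port
        return 0  # mirrors 'cmd=publish_result rc=0'

    def lookup(self, service: str) -> str:
        # mirrors 'cmd=lookup_result rc=0 port=...'
        return self._ports[service]

    def unpublish(self, service: str) -> int:
        del self._ports[service]
        return 0  # mirrors 'cmd=unpublish_result rc=0'

ns = NameService()
ns.publish("mpi4py-0", "tag#0$description#virt32a$port#45101$ifname#127.0.1.1$")
port = ns.lookup("mpi4py-0")  # the port string published above
ns.unpublish("mpi4py-0")
```

The real registry also returns nonzero `rc` for missing names; that error path is not exercised in this log and is omitted here.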
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testPickle (test_errhandler.TestErrhandler.testPickle) ... ok
testPredefined (test_errhandler.TestErrhandler.testPredefined) ... ok
testCall (test_errhandler.TestErrhandlerComm.testCall) ... ok
testCreate (test_errhandler.TestErrhandlerComm.testCreate) ... ok
testErrorsAbort (test_errhandler.TestErrhandlerComm.testErrorsAbort) ... ok
testErrorsFatal (test_errhandler.TestErrhandlerComm.testErrorsFatal) ... ok
testErrorsReturn (test_errhandler.TestErrhandlerComm.testErrorsReturn) ... ok
testGetFree (test_errhandler.TestErrhandlerComm.testGetFree) ... ok
testCall (test_errhandler.TestErrhandlerFile.testCall) ... ok
testCreate (test_errhandler.TestErrhandlerFile.testCreate) ... ok
testErrorsAbort (test_errhandler.TestErrhandlerFile.testErrorsAbort) ... ok
testErrorsFatal (test_errhandler.TestErrhandlerFile.testErrorsFatal) ... ok
testErrorsReturn (test_errhandler.TestErrhandlerFile.testErrorsReturn) ... ok
testGetFree (test_errhandler.TestErrhandlerFile.testGetFree) ... ok
testCall (test_errhandler.TestErrhandlerSession.testCall) ... ok
testCreate (test_errhandler.TestErrhandlerSession.testCreate) ... ok
testErrorsAbort (test_errhandler.TestErrhandlerSession.testErrorsAbort) ... ok
testErrorsFatal (test_errhandler.TestErrhandlerSession.testErrorsFatal) ... ok
testErrorsReturn (test_errhandler.TestErrhandlerSession.testErrorsReturn) ... ok
testGetFree (test_errhandler.TestErrhandlerSession.testGetFree) ... ok
testCall (test_errhandler.TestErrhandlerWin.testCall) ... ok
testCreate (test_errhandler.TestErrhandlerWin.testCreate) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testErrorsAbort (test_errhandler.TestErrhandlerWin.testErrorsAbort) ... ok
testErrorsFatal (test_errhandler.TestErrhandlerWin.testErrorsFatal) ... ok
testErrorsReturn (test_errhandler.TestErrhandlerWin.testErrorsReturn) ... ok
testGetFree (test_errhandler.TestErrhandlerWin.testGetFree) ... ok
testAddErrorClass (test_errorcode.TestErrorCode.testAddErrorClass) ... ok
testAddErrorClassCodeString (test_errorcode.TestErrorCode.testAddErrorClassCodeString) ... ok
testAddErrorCode (test_errorcode.TestErrorCode.testAddErrorCode) ... ok
testException (test_errorcode.TestErrorCode.testException) ... ok
testGetErrorClass (test_errorcode.TestErrorCode.testGetErrorClass) ... ok
testGetErrorStrings (test_errorcode.TestErrorCode.testGetErrorStrings) ... ok
testFreeSelf (test_exceptions.TestExcComm.testFreeSelf) ... ok
testFreeWorld (test_exceptions.TestExcComm.testFreeWorld) ... ok
testKeyvalInvalid (test_exceptions.TestExcComm.testKeyvalInvalid) ... ok
testAccessors (test_exceptions.TestExcCommNull.testAccessors) ... ok
testCompare (test_exceptions.TestExcCommNull.testCompare) ... ok
testDisconnect (test_exceptions.TestExcCommNull.testDisconnect) ... ok
testFree (test_exceptions.TestExcCommNull.testFree) ... ok
testGetAttr (test_exceptions.TestExcCommNull.testGetAttr) ... ok
testGetErrhandler (test_exceptions.TestExcCommNull.testGetErrhandler) ... ok
testInterNull (test_exceptions.TestExcCommNull.testInterNull) ... ok
testIntraNull (test_exceptions.TestExcCommNull.testIntraNull) ... ok
testSetErrhandler (test_exceptions.TestExcCommNull.testSetErrhandler) ... ok
testFreePredefined (test_exceptions.TestExcDatatype.testFreePredefined) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIntraNull (test_exceptions.TestExcCommNull.testIntraNull) ... ok testIntraNull (test_exceptions.TestExcCommNull.testIntraNull) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSetErrhandler (test_exceptions.TestExcCommNull.testSetErrhandler) ... ok testSetErrhandler (test_exceptions.TestExcCommNull.testSetErrhandler) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFreePredefined (test_exceptions.TestExcDatatype.testFreePredefined) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFreePredefined (test_exceptions.TestExcDatatype.testFreePredefined) ... ok testKeyvalInvalid (test_exceptions.TestExcDatatype.testKeyvalInvalid) ... ok testKeyvalInvalid (test_exceptions.TestExcDatatype.testKeyvalInvalid) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testKeyvalInvalid (test_exceptions.TestExcDatatype.testKeyvalInvalid) ... ok testCommit (test_exceptions.TestExcDatatypeNull.testCommit) ... ok testDup (test_exceptions.TestExcDatatypeNull.testDup) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFree (test_exceptions.TestExcDatatypeNull.testFree) ... ok ok ok testCommSelfSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommSelfSetErrhandler) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCommit (test_exceptions.TestExcDatatypeNull.testCommit) ... testCommit (test_exceptions.TestExcDatatypeNull.testCommit) ... ok testCommWorldSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommWorldSetErrhandler) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_exceptions.TestExcDatatypeNull.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testDup (test_exceptions.TestExcDatatypeNull.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFree (test_exceptions.TestExcErrhandlerNull.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok ok testFree (test_exceptions.TestExcDatatypeNull.testFree) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFree (test_exceptions.TestExcDatatypeNull.testFree) ... testAccessors (test_exceptions.TestExcGroupNull.testAccessors) ... ok ok testCommSelfSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommSelfSetErrhandler) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCommSelfSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommSelfSetErrhandler) ... ok testCompare (test_exceptions.TestExcGroupNull.testCompare) ... ok testCommWorldSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommWorldSetErrhandler) ... ok testCommWorldSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommWorldSetErrhandler) ... ok testFree (test_exceptions.TestExcErrhandlerNull.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFree (test_exceptions.TestExcErrhandlerNull.testFree) ... ok testDelete (test_exceptions.TestExcInfo.testDelete) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetNthKey (test_exceptions.TestExcInfo.testGetNthKey) ... ok testAccessors (test_exceptions.TestExcGroupNull.testAccessors) ... ok testCompare (test_exceptions.TestExcGroupNull.testCompare) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAccessors (test_exceptions.TestExcGroupNull.testAccessors) ... ok testCompare (test_exceptions.TestExcGroupNull.testCompare) ... testDelete (test_exceptions.TestExcInfoNull.testDelete) ... ok testDup (test_exceptions.TestExcInfoNull.testDup) ... ok testDelete (test_exceptions.TestExcInfo.testDelete) ... ok testDelete (test_exceptions.TestExcInfo.testDelete) ... ok testFree (test_exceptions.TestExcInfoNull.testFree) ... ok testGet (test_exceptions.TestExcInfoNull.testGet) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetNthKey (test_exceptions.TestExcInfo.testGetNthKey) ... ok ok testGetNthKey (test_exceptions.TestExcInfo.testGetNthKey) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetNKeys (test_exceptions.TestExcInfoNull.testGetNKeys) ... 
ok testDelete (test_exceptions.TestExcInfoNull.testDelete) ... ok testDup (test_exceptions.TestExcInfoNull.testDup) ... testDelete (test_exceptions.TestExcInfoNull.testDelete) ... ok testDup (test_exceptions.TestExcInfoNull.testDup) ... testGetNthKey (test_exceptions.TestExcInfoNull.testGetNthKey) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSet (test_exceptions.TestExcInfoNull.testSet) ... ok testTruth (test_exceptions.TestExcInfoNull.testTruth) ... ok testFree (test_exceptions.TestExcInfoNull.testFree) ... ok testGet (test_exceptions.TestExcInfoNull.testGet) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFree (test_exceptions.TestExcInfoNull.testFree) ... ok testGet (test_exceptions.TestExcInfoNull.testGet) ... ok ok testGetNKeys (test_exceptions.TestExcInfoNull.testGetNKeys) ... testGetNKeys (test_exceptions.TestExcInfoNull.testGetNKeys) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFreePredefined (test_exceptions.TestExcOp.testFreePredefined) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetNthKey (test_exceptions.TestExcInfoNull.testGetNthKey) ... ok testSet (test_exceptions.TestExcInfoNull.testSet) ... ok testTruth (test_exceptions.TestExcInfoNull.testTruth) ... ok testGetNthKey (test_exceptions.TestExcInfoNull.testGetNthKey) ... ok testSet (test_exceptions.TestExcInfoNull.testSet) ... 
ok testTruth (test_exceptions.TestExcInfoNull.testTruth) ... ok ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFreePredefined (test_exceptions.TestExcOp.testFreePredefined) ... testFreePredefined (test_exceptions.TestExcOp.testFreePredefined) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFree (test_exceptions.TestExcOpNull.testFree) ... ok testCancel (test_exceptions.TestExcRequestNull.testCancel) ... ok testFree (test_exceptions.TestExcRequestNull.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testFree (test_exceptions.TestExcOpNull.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFree (test_exceptions.TestExcOpNull.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCreateGroup (test_exceptions.TestExcSession.testCreateGroup) ... ok ok testCancel (test_exceptions.TestExcRequestNull.testCancel) ... ok testCancel (test_exceptions.TestExcRequestNull.testCancel) ... ok testGetNthPsetNeg (test_exceptions.TestExcSession.testGetNthPsetNeg) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFree (test_exceptions.TestExcRequestNull.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetNthPsetPos (test_exceptions.TestExcSession.testGetNthPsetPos) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFree (test_exceptions.TestExcRequestNull.testFree) ... 
ok testCreateGroup (test_exceptions.TestExcSession.testCreateGroup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCreateGroup (test_exceptions.TestExcSession.testCreateGroup) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetPsetInfo (test_exceptions.TestExcSession.testGetPsetInfo) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetNthPsetNeg (test_exceptions.TestExcSession.testGetNthPsetNeg) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCreateGroup (test_exceptions.TestExcSessionNull.testCreateGroup) ... ok testGetErrhandler (test_exceptions.TestExcSessionNull.testGetErrhandler) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetNthPsetNeg (test_exceptions.TestExcSession.testGetNthPsetNeg) ... ok testGetNthPsetPos (test_exceptions.TestExcSession.testGetNthPsetPos) ... ok ok testGetNthPsetPos (test_exceptions.TestExcSession.testGetNthPsetPos) ... ok testGetPsetInfo (test_exceptions.TestExcSession.testGetPsetInfo) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetInfo (test_exceptions.TestExcSessionNull.testGetInfo) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetPsetInfo (test_exceptions.TestExcSession.testGetPsetInfo) ... ok testCreateGroup (test_exceptions.TestExcSessionNull.testCreateGroup) ... ok testCreateGroup (test_exceptions.TestExcSessionNull.testCreateGroup) ... ok testGetNthPset (test_exceptions.TestExcSessionNull.testGetNthPset) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetErrhandler (test_exceptions.TestExcSessionNull.testGetErrhandler) ... ok testGetErrhandler (test_exceptions.TestExcSessionNull.testGetErrhandler) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetNumPsets (test_exceptions.TestExcSessionNull.testGetNumPsets) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetInfo (test_exceptions.TestExcSessionNull.testGetInfo) ... ok testGetInfo (test_exceptions.TestExcSessionNull.testGetInfo) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetPsetInfo (test_exceptions.TestExcSessionNull.testGetPsetInfo) ... ok testSetErrhandler (test_exceptions.TestExcSessionNull.testSetErrhandler) ... ok ok testGetNthPset (test_exceptions.TestExcSessionNull.testGetNthPset) ... ok testGetNumPsets (test_exceptions.TestExcSessionNull.testGetNumPsets) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetNthPset (test_exceptions.TestExcSessionNull.testGetNthPset) ... ok testGetNumPsets (test_exceptions.TestExcSessionNull.testGetNumPsets) ... ok testGetCount (test_exceptions.TestExcStatus.testGetCount) ... ok testGetElements (test_exceptions.TestExcStatus.testGetElements) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetPsetInfo (test_exceptions.TestExcSessionNull.testGetPsetInfo) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetPsetInfo (test_exceptions.TestExcSessionNull.testGetPsetInfo) ... ok testSetErrhandler (test_exceptions.TestExcSessionNull.testSetErrhandler) ... ok ok testSetElements (test_exceptions.TestExcStatus.testSetElements) ... 
ok testKeyvalInvalid (test_exceptions.TestExcWin.testKeyvalInvalid) ... testSetErrhandler (test_exceptions.TestExcSessionNull.testSetErrhandler) ... ok testGetCount (test_exceptions.TestExcStatus.testGetCount) ... ok testGetCount (test_exceptions.TestExcStatus.testGetCount) ... ok testGetElements (test_exceptions.TestExcStatus.testGetElements) ... ok testCallErrhandler (test_exceptions.TestExcWinNull.testCallErrhandler) ... ok testGetElements (test_exceptions.TestExcStatus.testGetElements) ... ok ok testSetElements (test_exceptions.TestExcStatus.testSetElements) ... ok testSetElements (test_exceptions.TestExcStatus.testSetElements) ... ok testFree (test_exceptions.TestExcWinNull.testFree) ... ok testGetErrhandler (test_exceptions.TestExcWinNull.testGetErrhandler) ... ok testKeyvalInvalid (test_exceptions.TestExcWin.testKeyvalInvalid) ... testKeyvalInvalid (test_exceptions.TestExcWin.testKeyvalInvalid) ... ok testCallErrhandler (test_exceptions.TestExcWinNull.testCallErrhandler) ... ok testSetErrhandler (test_exceptions.TestExcWinNull.testSetErrhandler) ... ok testCallErrhandler (test_exceptions.TestExcWinNull.testCallErrhandler) ... ok testFree (test_exceptions.TestExcWinNull.testFree) ... ok testGetSetErrhandler (test_file.TestFileNull.testGetSetErrhandler) ... ok ok testFree (test_exceptions.TestExcWinNull.testFree) ... ok testGetErrhandler (test_exceptions.TestExcWinNull.testGetErrhandler) ... ok testBytes (test_file.TestFilePath.testBytes) ... testGetErrhandler (test_exceptions.TestExcWinNull.testGetErrhandler) ... ok testSetErrhandler (test_exceptions.TestExcWinNull.testSetErrhandler) ... ok testSetErrhandler (test_exceptions.TestExcWinNull.testSetErrhandler) ... ok ok testGetSetErrhandler (test_file.TestFileNull.testGetSetErrhandler) ... testGetSetErrhandler (test_file.TestFileNull.testGetSetErrhandler) ... ok testBytes (test_file.TestFilePath.testBytes) ... ok testBytes (test_file.TestFilePath.testBytes) ... 
ok testGetAmode (test_file.TestFilePath.testGetAmode) ... ok testGetAmode (test_file.TestFilePath.testGetAmode) ... ok testGetByteOffset (test_file.TestFilePath.testGetByteOffset) ... ok testGetAmode (test_file.TestFilePath.testGetAmode) ... ok testGetByteOffset (test_file.TestFilePath.testGetByteOffset) ... ok testGetErrhandler (test_file.TestFilePath.testGetErrhandler) ... ok testGetByteOffset (test_file.TestFilePath.testGetByteOffset) ... ok ok testGetErrhandler (test_file.TestFilePath.testGetErrhandler) ... testGetGroup (test_file.TestFilePath.testGetGroup) ... ok testGetErrhandler (test_file.TestFilePath.testGetErrhandler) ... ok testGetSetAtomicity (test_file.TestFilePath.testGetSetAtomicity) ... ok testGetGroup (test_file.TestFilePath.testGetGroup) ... ok testGetGroup (test_file.TestFilePath.testGetGroup) ... ok ok testGetSetAtomicity (test_file.TestFilePath.testGetSetAtomicity) ... testGetSetInfo (test_file.TestFilePath.testGetSetInfo) ... ok testGetSetAtomicity (test_file.TestFilePath.testGetSetAtomicity) ... ok ok testGetSetInfo (test_file.TestFilePath.testGetSetInfo) ... testGetSetSize (test_file.TestFilePath.testGetSetSize) ... ok testGetSetInfo (test_file.TestFilePath.testGetSetInfo) ... ok testGetSetSize (test_file.TestFilePath.testGetSetSize) ... ok testGetSetView (test_file.TestFilePath.testGetSetView) ... ok testGetSetSize (test_file.TestFilePath.testGetSetSize) ... ok testGetSetView (test_file.TestFilePath.testGetSetView) ... ok testGetSetView (test_file.TestFilePath.testGetSetView) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetTypeExtent (test_file.TestFilePath.testGetTypeExtent) ... ok testPath (test_file.TestFilePath.testPath) ... ok testPickle (test_file.TestFilePath.testPickle) ... 
ok testErrorsAbort (test_errhandler.TestErrhandlerFile.testErrorsAbort) ... ok testErrorsFatal (test_errhandler.TestErrhandlerFile.testErrorsFatal) ... ok testErrorsReturn (test_errhandler.TestErrhandlerFile.testErrorsReturn) ... ok testGetFree (test_errhandler.TestErrhandlerFile.testGetFree) ... ok testCall (test_errhandler.TestErrhandlerSession.testCall) ... ok testCreate (test_errhandler.TestErrhandlerSession.testCreate) ... ok testErrorsAbort (test_errhandler.TestErrhandlerSession.testErrorsAbort) ... ok testErrorsFatal (test_errhandler.TestErrhandlerSession.testErrorsFatal) ... ok testErrorsReturn (test_errhandler.TestErrhandlerSession.testErrorsReturn) ... ok testGetFree (test_errhandler.TestErrhandlerSession.testGetFree) ... ok testCall (test_errhandler.TestErrhandlerWin.testCall) ... ok testCreate (test_errhandler.TestErrhandlerWin.testCreate) ... ok testGetTypeExtent (test_file.TestFilePath.testGetTypeExtent) ... ok testPath (test_file.TestFilePath.testPath) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetTypeExtent (test_file.TestFilePath.testGetTypeExtent) ... ok testPath (test_file.TestFilePath.testPath) ... ok testPickle (test_file.TestFilePath.testPickle) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPreallocate (test_file.TestFilePath.testPreallocate) ... ok testErrorsAbort (test_errhandler.TestErrhandlerWin.testErrorsAbort) ... ok testErrorsFatal (test_errhandler.TestErrhandlerWin.testErrorsFatal) ... ok testErrorsReturn (test_errhandler.TestErrhandlerWin.testErrorsReturn) ... ok testGetFree (test_errhandler.TestErrhandlerWin.testGetFree) ... ok testAddErrorClass (test_errorcode.TestErrorCode.testAddErrorClass) ... ok testAddErrorClassCodeString (test_errorcode.TestErrorCode.testAddErrorClassCodeString) ... ok testAddErrorCode (test_errorcode.TestErrorCode.testAddErrorCode) ... 
ok testPickle (test_file.TestFilePath.testPickle) ... ok testPreallocate (test_file.TestFilePath.testPreallocate) ... ok testPreallocate (test_file.TestFilePath.testPreallocate) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testException (test_errorcode.TestErrorCode.testException) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetErrorClass (test_errorcode.TestErrorCode.testGetErrorClass) ... ok testGetErrorStrings (test_errorcode.TestErrorCode.testGetErrorStrings) ... ok testFreeSelf (test_exceptions.TestExcComm.testFreeSelf) ... ok testFreeWorld (test_exceptions.TestExcComm.testFreeWorld) ... ok testKeyvalInvalid (test_exceptions.TestExcComm.testKeyvalInvalid) ... ok testAccessors (test_exceptions.TestExcCommNull.testAccessors) ... ok testCompare (test_exceptions.TestExcCommNull.testCompare) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDisconnect (test_exceptions.TestExcCommNull.testDisconnect) ... ok testFree (test_exceptions.TestExcCommNull.testFree) ... ok testGetAttr (test_exceptions.TestExcCommNull.testGetAttr) ... ok testGetErrhandler (test_exceptions.TestExcCommNull.testGetErrhandler) ... ok testInterNull (test_exceptions.TestExcCommNull.testInterNull) ... ok testIntraNull (test_exceptions.TestExcCommNull.testIntraNull) ... ok testSetErrhandler (test_exceptions.TestExcCommNull.testSetErrhandler) ... ok testFreePredefined (test_exceptions.TestExcDatatype.testFreePredefined) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testKeyvalInvalid (test_exceptions.TestExcDatatype.testKeyvalInvalid) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommit (test_exceptions.TestExcDatatypeNull.testCommit) ... ok testDup (test_exceptions.TestExcDatatypeNull.testDup) ... ok testFree (test_exceptions.TestExcDatatypeNull.testFree) ... 
ok testCommSelfSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommSelfSetErrhandler) ... ok testCommWorldSetErrhandler (test_exceptions.TestExcErrhandlerNull.testCommWorldSetErrhandler) ... ok testFree (test_exceptions.TestExcErrhandlerNull.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAccessors (test_exceptions.TestExcGroupNull.testAccessors) ... ok testCompare (test_exceptions.TestExcGroupNull.testCompare) ... ok testDelete (test_exceptions.TestExcInfo.testDelete) ... ok testGetNthKey (test_exceptions.TestExcInfo.testGetNthKey) ... ok testDelete (test_exceptions.TestExcInfoNull.testDelete) ... ok testDup (test_exceptions.TestExcInfoNull.testDup) ... ok testFree (test_exceptions.TestExcInfoNull.testFree) ... ok testGet (test_exceptions.TestExcInfoNull.testGet) ... ok testGetNKeys (test_exceptions.TestExcInfoNull.testGetNKeys) ... ok testGetNthKey (test_exceptions.TestExcInfoNull.testGetNthKey) ... ok testSet (test_exceptions.TestExcInfoNull.testSet) ... ok testTruth (test_exceptions.TestExcInfoNull.testTruth) ... ok testFreePredefined (test_exceptions.TestExcOp.testFreePredefined) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFree (test_exceptions.TestExcOpNull.testFree) ... ok testCancel (test_exceptions.TestExcRequestNull.testCancel) ... ok testFree (test_exceptions.TestExcRequestNull.testFree) ... ok testCreateGroup (test_exceptions.TestExcSession.testCreateGroup) ... ok testGetNthPsetNeg (test_exceptions.TestExcSession.testGetNthPsetNeg) ... ok testGetNthPsetPos (test_exceptions.TestExcSession.testGetNthPsetPos) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetPsetInfo (test_exceptions.TestExcSession.testGetPsetInfo) ... ok testCreateGroup (test_exceptions.TestExcSessionNull.testCreateGroup) ... ok testGetErrhandler (test_exceptions.TestExcSessionNull.testGetErrhandler) ... ok testGetInfo (test_exceptions.TestExcSessionNull.testGetInfo) ... 
ok
testGetNthPset (test_exceptions.TestExcSessionNull.testGetNthPset) ... ok
testGetNumPsets (test_exceptions.TestExcSessionNull.testGetNumPsets) ... ok
testGetPsetInfo (test_exceptions.TestExcSessionNull.testGetPsetInfo) ... ok
testSetErrhandler (test_exceptions.TestExcSessionNull.testSetErrhandler) ... ok
testGetCount (test_exceptions.TestExcStatus.testGetCount) ... ok
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testGetElements (test_exceptions.TestExcStatus.testGetElements) ... ok
testSetElements (test_exceptions.TestExcStatus.testSetElements) ... ok
testKeyvalInvalid (test_exceptions.TestExcWin.testKeyvalInvalid) ... ok
testCallErrhandler (test_exceptions.TestExcWinNull.testCallErrhandler) ... ok
testFree (test_exceptions.TestExcWinNull.testFree) ... ok
testGetErrhandler (test_exceptions.TestExcWinNull.testGetErrhandler) ... ok
testSetErrhandler (test_exceptions.TestExcWinNull.testSetErrhandler) ... ok
testGetSetErrhandler (test_file.TestFileNull.testGetSetErrhandler) ... ok
testBytes (test_file.TestFilePath.testBytes) ... ok
testGetAmode (test_file.TestFilePath.testGetAmode) ... ok
testGetByteOffset (test_file.TestFilePath.testGetByteOffset) ... ok
testGetErrhandler (test_file.TestFilePath.testGetErrhandler) ... ok
testGetGroup (test_file.TestFilePath.testGetGroup) ... ok
testGetSetAtomicity (test_file.TestFilePath.testGetSetAtomicity) ... ok
testGetSetInfo (test_file.TestFilePath.testGetSetInfo) ... ok
testGetSetSize (test_file.TestFilePath.testGetSetSize) ... ok
testGetSetView (test_file.TestFilePath.testGetSetView) ... ok
testGetTypeExtent (test_file.TestFilePath.testGetTypeExtent) ... ok
testPath (test_file.TestFilePath.testPath) ... ok
testPickle (test_file.TestFilePath.testPickle) ... ok
testPreallocate (test_file.TestFilePath.testPreallocate) ... ok
testPyProps (test_file.TestFilePath.testPyProps) ... ok
testSeekGetPosition (test_file.TestFilePath.testSeekGetPosition) ... ok
testSeekGetPositionShared (test_file.TestFilePath.testSeekGetPositionShared) ... ok
testStr (test_file.TestFilePath.testStr) ... ok
testSync (test_file.TestFilePath.testSync) ... ok
testGetAmode (test_file.TestFileSelf.testGetAmode) ... ok
testGetByteOffset (test_file.TestFileSelf.testGetByteOffset) ... ok
testGetErrhandler (test_file.TestFileSelf.testGetErrhandler) ... ok
testGetGroup (test_file.TestFileSelf.testGetGroup) ... ok
testGetSetAtomicity (test_file.TestFileSelf.testGetSetAtomicity) ... ok
testGetSetInfo (test_file.TestFileSelf.testGetSetInfo) ... ok
testGetSetSize (test_file.TestFileSelf.testGetSetSize) ... ok
testGetSetView (test_file.TestFileSelf.testGetSetView) ... ok
testGetTypeExtent (test_file.TestFileSelf.testGetTypeExtent) ... ok
testPickle (test_file.TestFileSelf.testPickle) ... ok
testPreallocate (test_file.TestFileSelf.testPreallocate) ... ok
testPyProps (test_file.TestFileSelf.testPyProps) ... ok
testSeekGetPosition (test_file.TestFileSelf.testSeekGetPosition) ... ok
testSeekGetPositionShared (test_file.TestFileSelf.testSeekGetPositionShared) ... ok
testSync (test_file.TestFileSelf.testSync) ... ok
testFortran (test_fortran.TestFortranComm.testFortran) ... ok
testFortran (test_fortran.TestFortranDatatype.testFortran) ... ok
testFortran (test_fortran.TestFortranErrhandler.testFortran) ... ok
testFortran (test_fortran.TestFortranFile.testFortran) ... ok
testFortran (test_fortran.TestFortranGroup.testFortran) ... ok
testFortran (test_fortran.TestFortranInfo.testFortran) ... ok
testFortran (test_fortran.TestFortranMessage.testFortran) ... ok
testFortran (test_fortran.TestFortranOp.testFortran) ... ok
testFortran (test_fortran.TestFortranRequest.testFortran) ... ok
testFortran (test_fortran.TestFortranSession.testFortran) ... ok
testFintArray (test_fortran.TestFortranStatus.testFintArray) ... ok
testFortran (test_fortran.TestFortranStatus.testFortran) ... ok
testFortran (test_fortran.TestFortranWin.testFortran) ... ok
testAll (test_grequest.TestGrequest.testAll) ... ok
testAll1 (test_grequest.TestGrequest.testAll1) ... ok
testAll2 (test_grequest.TestGrequest.testAll2) ... ok
testConstructor (test_grequest.TestGrequest.testConstructor) ... ok
testExceptionHandling (test_grequest.TestGrequest.testExceptionHandling) ... ok
testPyCompleteTest (test_grequest.TestGrequest.testPyCompleteTest) ... ok
testPyCompleteWait (test_grequest.TestGrequest.testPyCompleteWait) ... ok
testCompare (test_group.TestGroupEmpty.testCompare) ... ok
testDifference (test_group.TestGroupEmpty.testDifference) ... ok
testDup (test_group.TestGroupEmpty.testDup) ... ok
testEmpty (test_group.TestGroupEmpty.testEmpty) ... ok
testExcl (test_group.TestGroupEmpty.testExcl) ... ok
testIncl (test_group.TestGroupEmpty.testIncl) ... ok
testIntersection (test_group.TestGroupEmpty.testIntersection) ... ok
testPickle (test_group.TestGroupEmpty.testPickle) ... ok
testProperties (test_group.TestGroupEmpty.testProperties) ... ok
testRangeExcl (test_group.TestGroupEmpty.testRangeExcl) ... ok
testRangeIncl (test_group.TestGroupEmpty.testRangeIncl) ... ok
testRank (test_group.TestGroupEmpty.testRank) ... ok
testSize (test_group.TestGroupEmpty.testSize) ... ok
testTranslRanks (test_group.TestGroupEmpty.testTranslRanks) ... ok
testTranslRanksGroupEmpty (test_group.TestGroupEmpty.testTranslRanksGroupEmpty) ... ok
testTranslRanksProcNull (test_group.TestGroupEmpty.testTranslRanksProcNull) ... ok
testUnion (test_group.TestGroupEmpty.testUnion) ... ok
testConstructor (test_group.TestGroupNull.testConstructor) ... ok
testNull (test_group.TestGroupNull.testNull) ... ok
testPickle (test_group.TestGroupNull.testPickle) ... ok
testCompare (test_group.TestGroupSelf.testCompare) ... ok
testDifference (test_group.TestGroupSelf.testDifference) ... ok
testDup (test_group.TestGroupSelf.testDup) ... ok
testExcl (test_group.TestGroupSelf.testExcl) ... ok
testIncl (test_group.TestGroupSelf.testIncl) ... ok
testIntersection (test_group.TestGroupSelf.testIntersection) ... ok
testPickle (test_group.TestGroupSelf.testPickle) ... ok
testProperties (test_group.TestGroupSelf.testProperties) ... ok
testRangeExcl (test_group.TestGroupSelf.testRangeExcl) ... ok
testRangeIncl (test_group.TestGroupSelf.testRangeIncl) ... ok
testRank (test_group.TestGroupSelf.testRank) ... ok
testSize (test_group.TestGroupSelf.testSize) ... ok
testTranslRanks (test_group.TestGroupSelf.testTranslRanks) ... ok
testTranslRanksGroupEmpty (test_group.TestGroupSelf.testTranslRanksGroupEmpty) ... ok
testTranslRanksProcNull (test_group.TestGroupSelf.testTranslRanksProcNull) ... ok
testUnion (test_group.TestGroupSelf.testUnion) ... ok
testCompare (test_group.TestGroupWorld.testCompare) ... ok
testDifference (test_group.TestGroupWorld.testDifference) ... ok
testDup (test_group.TestGroupWorld.testDup) ... ok
testExcl (test_group.TestGroupWorld.testExcl) ... ok
testIncl (test_group.TestGroupWorld.testIncl) ... ok
testIntersection (test_group.TestGroupWorld.testIntersection) ... ok
testPickle (test_group.TestGroupWorld.testPickle) ... ok
testProperties (test_group.TestGroupWorld.testProperties) ... ok
testRangeExcl (test_group.TestGroupWorld.testRangeExcl) ... ok
testRangeIncl (test_group.TestGroupWorld.testRangeIncl) ... ok
testRank (test_group.TestGroupWorld.testRank) ... ok
testSize (test_group.TestGroupWorld.testSize) ... ok
testTranslRanks (test_group.TestGroupWorld.testTranslRanks) ... ok
testTranslRanksGroupEmpty (test_group.TestGroupWorld.testTranslRanksGroupEmpty) ... ok
testTranslRanksProcNull (test_group.TestGroupWorld.testTranslRanksProcNull) ... ok
testUnion (test_group.TestGroupWorld.testUnion) ... ok
testCreate (test_info.TestInfo.testCreate) ... ok
testCreateBad (test_info.TestInfo.testCreateBad) ... ok
testDup (test_info.TestInfo.testDup) ... ok
testGet (test_info.TestInfo.testGet) ... ok
testGetNKeys (test_info.TestInfo.testGetNKeys) ... ok
testGetSetDelete (test_info.TestInfo.testGetSetDelete) ... ok
testPickle (test_info.TestInfo.testPickle) ... ok
testPyMethods (test_info.TestInfo.testPyMethods) ... ok
testTruth (test_info.TestInfo.testTruth) ... ok
testCreateEnv (test_info.TestInfoEnv.testCreateEnv) ... ok
testDup (test_info.TestInfoEnv.testDup) ... ok
testPickle (test_info.TestInfoEnv.testPickle) ... ok
testPyMethods (test_info.TestInfoEnv.testPyMethods) ... ok
testTruth (test_info.TestInfoEnv.testTruth) ... ok
testPickle (test_info.TestInfoNull.testPickle) ... ok
testPyMethods (test_info.TestInfoNull.testPyMethods) ... ok
testTruth (test_info.TestInfoNull.testTruth) ... ok
testRegister (test_io.TestDatarep.testRegister) ... ok
testIReadIWrite (test_io.TestIOBasicSelf.testIReadIWrite) ... ok
testIReadIWriteAll (test_io.TestIOBasicSelf.testIReadIWriteAll) ... ok
testIReadIWriteAt (test_io.TestIOBasicSelf.testIReadIWriteAt) ... ok
testIReadIWriteAtAll (test_io.TestIOBasicSelf.testIReadIWriteAtAll) ... ok
testIReadIWriteShared (test_io.TestIOBasicSelf.testIReadIWriteShared) ... ok
testReadWrite (test_io.TestIOBasicSelf.testReadWrite) ... ok
testReadWriteAll (test_io.TestIOBasicSelf.testReadWriteAll) ... ok
testReadWriteAllBeginEnd (test_io.TestIOBasicSelf.testReadWriteAllBeginEnd) ... ok
testReadWriteAt (test_io.TestIOBasicSelf.testReadWriteAt) ... ok
testReadWriteAtAll (test_io.TestIOBasicSelf.testReadWriteAtAll) ... ok
testReadWriteAtAllBeginEnd (test_io.TestIOBasicSelf.testReadWriteAtAllBeginEnd) ... ok
testReadWriteOrdered (test_io.TestIOBasicSelf.testReadWriteOrdered) ... ok
testReadWriteOrderedBeginEnd (test_io.TestIOBasicSelf.testReadWriteOrderedBeginEnd) ... ok
testReadWriteShared (test_io.TestIOBasicSelf.testReadWriteShared) ... ok
testIReadIWrite (test_io.TestIOBasicWorld.testIReadIWrite) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testIReadIWrite (test_io.TestIOBasicWorld.testIReadIWrite) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIReadIWrite (test_io.TestIOBasicWorld.testIReadIWrite) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIReadIWrite (test_io.TestIOBasicWorld.testIReadIWrite) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIReadIWriteAll (test_io.TestIOBasicWorld.testIReadIWriteAll) ... ok testIReadIWriteAll (test_io.TestIOBasicWorld.testIReadIWriteAll) ... ok testIReadIWriteAll (test_io.TestIOBasicWorld.testIReadIWriteAll) ... ok testIReadIWriteAll (test_io.TestIOBasicWorld.testIReadIWriteAll) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIReadIWriteAt (test_io.TestIOBasicWorld.testIReadIWriteAt) ... ok testIReadIWriteAt (test_io.TestIOBasicWorld.testIReadIWriteAt) ... ok testIReadIWriteAt (test_io.TestIOBasicWorld.testIReadIWriteAt) ... ok testIReadIWriteAt (test_io.TestIOBasicWorld.testIReadIWriteAt) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIReadIWriteAtAll (test_io.TestIOBasicWorld.testIReadIWriteAtAll) ... ok testIReadIWriteAtAll (test_io.TestIOBasicWorld.testIReadIWriteAtAll) ... ok testIReadIWriteAtAll (test_io.TestIOBasicWorld.testIReadIWriteAtAll) ... ok testIReadIWriteAtAll (test_io.TestIOBasicWorld.testIReadIWriteAtAll) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIReadIWriteShared (test_io.TestIOBasicWorld.testIReadIWriteShared) ... ok testIReadIWriteShared (test_io.TestIOBasicWorld.testIReadIWriteShared) ... ok testIReadIWriteShared (test_io.TestIOBasicWorld.testIReadIWriteShared) ... ok testIReadIWriteShared (test_io.TestIOBasicWorld.testIReadIWriteShared) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWrite (test_io.TestIOBasicWorld.testReadWrite) ... ok testReadWrite (test_io.TestIOBasicWorld.testReadWrite) ... ok testReadWrite (test_io.TestIOBasicWorld.testReadWrite) ... ok testReadWrite (test_io.TestIOBasicWorld.testReadWrite) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteAll (test_io.TestIOBasicWorld.testReadWriteAll) ... ok testReadWriteAll (test_io.TestIOBasicWorld.testReadWriteAll) ... ok testReadWriteAll (test_io.TestIOBasicWorld.testReadWriteAll) ... ok testReadWriteAll (test_io.TestIOBasicWorld.testReadWriteAll) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testReadWriteAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAllBeginEnd) ... Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAllBeginEnd) ... ok testReadWriteAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAllBeginEnd) ... 
ok testReadWriteAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAllBeginEnd) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteAt (test_io.TestIOBasicWorld.testReadWriteAt) ... ok testReadWriteAt (test_io.TestIOBasicWorld.testReadWriteAt) ... ok testReadWriteAt (test_io.TestIOBasicWorld.testReadWriteAt) ... ok testReadWriteAt (test_io.TestIOBasicWorld.testReadWriteAt) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteAtAll (test_io.TestIOBasicWorld.testReadWriteAtAll) ... ok testReadWriteAtAll (test_io.TestIOBasicWorld.testReadWriteAtAll) ... ok testReadWriteAtAll (test_io.TestIOBasicWorld.testReadWriteAtAll) ... ok testReadWriteAtAll (test_io.TestIOBasicWorld.testReadWriteAtAll) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteAtAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAtAllBeginEnd) ... ok testReadWriteAtAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAtAllBeginEnd) ... ok testReadWriteAtAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAtAllBeginEnd) ... ok testReadWriteAtAllBeginEnd (test_io.TestIOBasicWorld.testReadWriteAtAllBeginEnd) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testReadWriteOrdered (test_io.TestIOBasicWorld.testReadWriteOrdered) ... 
Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteOrdered (test_io.TestIOBasicWorld.testReadWriteOrdered) ... ok testReadWriteOrdered (test_io.TestIOBasicWorld.testReadWriteOrdered) ... ok testReadWriteOrdered (test_io.TestIOBasicWorld.testReadWriteOrdered) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteOrderedBeginEnd (test_io.TestIOBasicWorld.testReadWriteOrderedBeginEnd) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteOrderedBeginEnd (test_io.TestIOBasicWorld.testReadWriteOrderedBeginEnd) ... ok testReadWriteOrderedBeginEnd (test_io.TestIOBasicWorld.testReadWriteOrderedBeginEnd) ... ok testReadWriteOrderedBeginEnd (test_io.TestIOBasicWorld.testReadWriteOrderedBeginEnd) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReadWriteShared (test_io.TestIOBasicWorld.testReadWriteShared) ... ok testReadWriteShared (test_io.TestIOBasicWorld.testReadWriteShared) ... ok testReadWriteShared (test_io.TestIOBasicWorld.testReadWriteShared) ... ok testReadWriteShared (test_io.TestIOBasicWorld.testReadWriteShared) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testContiguous (test_io.TestIOViewSelf.testContiguous) ... ok testContiguous (test_io.TestIOViewSelf.testContiguous) ... ok testContiguous (test_io.TestIOViewSelf.testContiguous) ... ok testContiguous (test_io.TestIOViewSelf.testContiguous) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayBlock (test_io.TestIOViewSelf.testDarrayBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayBlock (test_io.TestIOViewSelf.testDarrayBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayBlock (test_io.TestIOViewSelf.testDarrayBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayBlock (test_io.TestIOViewSelf.testDarrayBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayCyclic (test_io.TestIOViewSelf.testDarrayCyclic) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayCyclic (test_io.TestIOViewSelf.testDarrayCyclic) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testDarrayCyclic (test_io.TestIOViewSelf.testDarrayCyclic) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayCyclic (test_io.TestIOViewSelf.testDarrayCyclic) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_io.TestIOViewSelf.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testDup (test_io.TestIOViewSelf.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_io.TestIOViewSelf.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_io.TestIOViewSelf.testDup) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexed (test_io.TestIOViewSelf.testHIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexed (test_io.TestIOViewSelf.testHIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexed (test_io.TestIOViewSelf.testHIndexed) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexed (test_io.TestIOViewSelf.testHIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexedBlock (test_io.TestIOViewSelf.testHIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexedBlock (test_io.TestIOViewSelf.testHIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexedBlock (test_io.TestIOViewSelf.testHIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexedBlock (test_io.TestIOViewSelf.testHIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_io.TestIOViewSelf.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_io.TestIOViewSelf.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_io.TestIOViewSelf.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_io.TestIOViewSelf.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_io.TestIOViewSelf.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_io.TestIOViewSelf.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_io.TestIOViewSelf.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_io.TestIOViewSelf.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_io.TestIOViewSelf.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_io.TestIOViewSelf.testIndexedBlock) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_io.TestIOViewSelf.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_io.TestIOViewSelf.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNamed (test_io.TestIOViewSelf.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNamed (test_io.TestIOViewSelf.testNamed) ... ok testNamed (test_io.TestIOViewSelf.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNamed (test_io.TestIOViewSelf.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_io.TestIOViewSelf.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_io.TestIOViewSelf.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_io.TestIOViewSelf.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_io.TestIOViewSelf.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_io.TestIOViewSelf.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_io.TestIOViewSelf.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_io.TestIOViewSelf.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_io.TestIOViewSelf.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_io.TestIOViewSelf.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_io.TestIOViewSelf.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_io.TestIOViewSelf.testVector) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_io.TestIOViewSelf.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testContiguous (test_io.TestIOViewWorld.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testContiguous (test_io.TestIOViewWorld.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testContiguous (test_io.TestIOViewWorld.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testContiguous (test_io.TestIOViewWorld.testContiguous) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayBlock (test_io.TestIOViewWorld.testDarrayBlock) ... ok testDarrayBlock (test_io.TestIOViewWorld.testDarrayBlock) ... ok testDarrayBlock (test_io.TestIOViewWorld.testDarrayBlock) ... ok testDarrayBlock (test_io.TestIOViewWorld.testDarrayBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDarrayCyclic (test_io.TestIOViewWorld.testDarrayCyclic) ... ok testDarrayCyclic (test_io.TestIOViewWorld.testDarrayCyclic) ... ok testDarrayCyclic (test_io.TestIOViewWorld.testDarrayCyclic) ... ok testDarrayCyclic (test_io.TestIOViewWorld.testDarrayCyclic) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testDup (test_io.TestIOViewWorld.testDup) ... ok testDup (test_io.TestIOViewWorld.testDup) ... ok testDup (test_io.TestIOViewWorld.testDup) ... ok testDup (test_io.TestIOViewWorld.testDup) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexed (test_io.TestIOViewWorld.testHIndexed) ... ok testHIndexed (test_io.TestIOViewWorld.testHIndexed) ... ok testHIndexed (test_io.TestIOViewWorld.testHIndexed) ... ok testHIndexed (test_io.TestIOViewWorld.testHIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexedBlock (test_io.TestIOViewWorld.testHIndexedBlock) ... ok testHIndexedBlock (test_io.TestIOViewWorld.testHIndexedBlock) ... ok testHIndexedBlock (test_io.TestIOViewWorld.testHIndexedBlock) ... ok testHIndexedBlock (test_io.TestIOViewWorld.testHIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_io.TestIOViewWorld.testHVector) ... ok testHVector (test_io.TestIOViewWorld.testHVector) ... ok testHVector (test_io.TestIOViewWorld.testHVector) ... ok testHVector (test_io.TestIOViewWorld.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_io.TestIOViewWorld.testIndexed) ... ok testIndexed (test_io.TestIOViewWorld.testIndexed) ... ok testIndexed (test_io.TestIOViewWorld.testIndexed) ... ok testIndexed (test_io.TestIOViewWorld.testIndexed) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexedBlock (test_io.TestIOViewWorld.testIndexedBlock) ... ok testIndexedBlock (test_io.TestIOViewWorld.testIndexedBlock) ... ok testIndexedBlock (test_io.TestIOViewWorld.testIndexedBlock) ... ok testIndexedBlock (test_io.TestIOViewWorld.testIndexedBlock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNamed (test_io.TestIOViewWorld.testNamed) ... ok testNamed (test_io.TestIOViewWorld.testNamed) ... ok testNamed (test_io.TestIOViewWorld.testNamed) ... ok testNamed (test_io.TestIOViewWorld.testNamed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct (test_io.TestIOViewWorld.testStruct) ... ok testStruct (test_io.TestIOViewWorld.testStruct) ... ok testStruct (test_io.TestIOViewWorld.testStruct) ... ok testStruct (test_io.TestIOViewWorld.testStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_io.TestIOViewWorld.testSubarray) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray (test_io.TestIOViewWorld.testSubarray) ... ok testSubarray (test_io.TestIOViewWorld.testSubarray) ... ok testSubarray (test_io.TestIOViewWorld.testSubarray) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_io.TestIOViewWorld.testVector) ... ok testVector (test_io.TestIOViewWorld.testVector) ... ok testVector (test_io.TestIOViewWorld.testVector) ... ok testVector (test_io.TestIOViewWorld.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testLargeCountSymbols (test_mpiapi.TestMPIAPI.testLargeCountSymbols) ... ok testLargeCountSymbols (test_mpiapi.TestMPIAPI.testLargeCountSymbols) ... ok testLargeCountSymbols (test_mpiapi.TestMPIAPI.testLargeCountSymbols) ... ok testLargeCountSymbols (test_mpiapi.TestMPIAPI.testLargeCountSymbols) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSymbolCoverage (test_mpiapi.TestMPIAPI.testSymbolCoverage) ... ok testSymbolCoverage (test_mpiapi.TestMPIAPI.testSymbolCoverage) ... ok testSymbolCoverage (test_mpiapi.TestMPIAPI.testSymbolCoverage) ... ok testSymbolCoverage (test_mpiapi.TestMPIAPI.testSymbolCoverage) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMemory1 (test_mpimem.TestMemory.testMemory1) ... ok testMemory2 (test_mpimem.TestMemory.testMemory2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMemory1 (test_mpimem.TestMemory.testMemory1) ... ok testMemory1 (test_mpimem.TestMemory.testMemory1) ... ok testMemory2 (test_mpimem.TestMemory.testMemory2) ... ok testMessageBad (test_msgspec.TestMessageBlock.testMessageBad) ... ok testAttrEmpty (test_msgspec.TestMessageCAIBuf.testAttrEmpty) ... ok testAttrNone (test_msgspec.TestMessageCAIBuf.testAttrNone) ... ok testAttrType (test_msgspec.TestMessageCAIBuf.testAttrType) ... ok testDataMissing (test_msgspec.TestMessageCAIBuf.testDataMissing) ... ok testDataNone (test_msgspec.TestMessageCAIBuf.testDataNone) ... ok testDataType (test_msgspec.TestMessageCAIBuf.testDataType) ... ok testDataValue (test_msgspec.TestMessageCAIBuf.testDataValue) ... ok testDescrMissing (test_msgspec.TestMessageCAIBuf.testDescrMissing) ... ok testDescrNone (test_msgspec.TestMessageCAIBuf.testDescrNone) ... ok testDescrType (test_msgspec.TestMessageCAIBuf.testDescrType) ... ok testDescrWarning (test_msgspec.TestMessageCAIBuf.testDescrWarning) ... ok testMemory2 (test_mpimem.TestMemory.testMemory2) ... ok testMessageBad (test_msgspec.TestMessageBlock.testMessageBad) ... ok testAttrEmpty (test_msgspec.TestMessageCAIBuf.testAttrEmpty) ... ok testAttrNone (test_msgspec.TestMessageCAIBuf.testAttrNone) ... ok testAttrType (test_msgspec.TestMessageCAIBuf.testAttrType) ... ok testDataMissing (test_msgspec.TestMessageCAIBuf.testDataMissing) ... ok testDataNone (test_msgspec.TestMessageCAIBuf.testDataNone) ... ok testDataType (test_msgspec.TestMessageCAIBuf.testDataType) ... ok testDataValue (test_msgspec.TestMessageCAIBuf.testDataValue) ... ok testDescrMissing (test_msgspec.TestMessageCAIBuf.testDescrMissing) ... ok testDescrNone (test_msgspec.TestMessageCAIBuf.testDescrNone) ... 
ok testMessageBad (test_msgspec.TestMessageBlock.testMessageBad) ... ok testAttrEmpty (test_msgspec.TestMessageCAIBuf.testAttrEmpty) ... ok testAttrNone (test_msgspec.TestMessageCAIBuf.testAttrNone) ... ok testAttrType (test_msgspec.TestMessageCAIBuf.testAttrType) ... ok testDataMissing (test_msgspec.TestMessageCAIBuf.testDataMissing) ... ok testDataNone (test_msgspec.TestMessageCAIBuf.testDataNone) ... ok testDataType (test_msgspec.TestMessageCAIBuf.testDataType) ... ok testDataValue (test_msgspec.TestMessageCAIBuf.testDataValue) ... ok testDescrMissing (test_msgspec.TestMessageCAIBuf.testDescrMissing) ... ok testDescrNone (test_msgspec.TestMessageCAIBuf.testDescrNone) ... ok testDescrType (test_msgspec.TestMessageCAIBuf.testDescrType) ... ok testDescrWarning (test_msgspec.TestMessageCAIBuf.testDescrWarning) ... ok testMemory1 (test_mpimem.TestMemory.testMemory1) ... ok testMemory2 (test_mpimem.TestMemory.testMemory2) ... ok testMessageBad (test_msgspec.TestMessageBlock.testMessageBad) ... ok testAttrEmpty (test_msgspec.TestMessageCAIBuf.testAttrEmpty) ... ok testAttrNone (test_msgspec.TestMessageCAIBuf.testAttrNone) ... ok testAttrType (test_msgspec.TestMessageCAIBuf.testAttrType) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMask (test_msgspec.TestMessageCAIBuf.testMask) ... ok testNonContiguous (test_msgspec.TestMessageCAIBuf.testNonContiguous) ... ok testReadonly (test_msgspec.TestMessageCAIBuf.testReadonly) ... ok testShapeMissing (test_msgspec.TestMessageCAIBuf.testShapeMissing) ... ok testShapeNone (test_msgspec.TestMessageCAIBuf.testShapeNone) ... 
ok
testShapeType (test_msgspec.TestMessageCAIBuf.testShapeType) ... ok
testShapeValue (test_msgspec.TestMessageCAIBuf.testShapeValue) ... ok
testStridesMissing (test_msgspec.TestMessageCAIBuf.testStridesMissing) ... ok
testStridesNone (test_msgspec.TestMessageCAIBuf.testStridesNone) ... ok
testStridesType (test_msgspec.TestMessageCAIBuf.testStridesType) ... ok
testTypestrEndian (test_msgspec.TestMessageCAIBuf.testTypestrEndian) ... ok
testTypestrItemsize (test_msgspec.TestMessageCAIBuf.testTypestrItemsize) ... ok
testTypestrMissing (test_msgspec.TestMessageCAIBuf.testTypestrMissing) ... ok
testTypestrNone (test_msgspec.TestMessageCAIBuf.testTypestrNone) ... ok
testTypestrType (test_msgspec.TestMessageCAIBuf.testTypestrType) ... ok
testByteOffset (test_msgspec.TestMessageDLPackCPUBuf.testByteOffset) ... ok
testCapsule (test_msgspec.TestMessageDLPackCPUBuf.testCapsule) ... ok
testContiguous (test_msgspec.TestMessageDLPackCPUBuf.testContiguous) ... ok
testDevice (test_msgspec.TestMessageDLPackCPUBuf.testDevice) ... ok
testDtypeCode (test_msgspec.TestMessageDLPackCPUBuf.testDtypeCode) ... ok
testDtypeLanes (test_msgspec.TestMessageDLPackCPUBuf.testDtypeLanes) ... ok
testNdim (test_msgspec.TestMessageDLPackCPUBuf.testNdim) ... ok
testReadonly (test_msgspec.TestMessageDLPackCPUBuf.testReadonly) ... ok
testShape (test_msgspec.TestMessageDLPackCPUBuf.testShape) ... ok
testStrides (test_msgspec.TestMessageDLPackCPUBuf.testStrides) ... ok
testVersion (test_msgspec.TestMessageDLPackCPUBuf.testVersion) ... ok
testMessageArray (test_msgspec.TestMessageRMA.testMessageArray) ... ok
testMessageBad (test_msgspec.TestMessageRMA.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageRMA.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageRMA.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageRMA.testMessageBytes) ... ok
testMessageCAIBuf (test_msgspec.TestMessageRMA.testMessageCAIBuf) ... ok
testMessageCuPy (test_msgspec.TestMessageRMA.testMessageCuPy) ... skipped 'cupy'
testMessageNone (test_msgspec.TestMessageRMA.testMessageNone) ... ok
testMessageNumPy (test_msgspec.TestMessageRMA.testMessageNumPy) ... ok
testMessageNumba (test_msgspec.TestMessageRMA.testMessageNumba) ... skipped 'numba'
testMessageBad (test_msgspec.TestMessageReduce.testMessageBad) ... ok
testMessageBad (test_msgspec.TestMessageReduceScatter.testMessageBad) ... ok
testMessageBad (test_msgspec.TestMessageSimple.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageSimple.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageSimple.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageSimple.testMessageBytes) ... ok
testMessageMemoryView (test_msgspec.TestMessageSimple.testMessageMemoryView) ... ok
testMessageNone (test_msgspec.TestMessageSimple.testMessageNone) ... ok
testArray1 (test_msgspec.TestMessageSimpleArray.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleArray.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleArray.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleArray.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleArray.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleArray.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleArray.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleCAIBuf.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleCAIBuf.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleCAIBuf.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleCAIBuf.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleCAIBuf.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleCAIBuf.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleCAIBuf.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleCuPy.testArray1) ... skipped 'cupy'
testArray2 (test_msgspec.TestMessageSimpleCuPy.testArray2) ... skipped 'cupy'
testArray3 (test_msgspec.TestMessageSimpleCuPy.testArray3) ... skipped 'cupy'
testArray4 (test_msgspec.TestMessageSimpleCuPy.testArray4) ... skipped 'cupy'
testArray5 (test_msgspec.TestMessageSimpleCuPy.testArray5) ... skipped 'cupy'
testArray6 (test_msgspec.TestMessageSimpleCuPy.testArray6) ... skipped 'cupy'
testBuffer (test_msgspec.TestMessageSimpleCuPy.testBuffer) ... skipped 'cupy'
testNotContiguous (test_msgspec.TestMessageSimpleCuPy.testNotContiguous) ... skipped 'cupy'
testOrderC (test_msgspec.TestMessageSimpleCuPy.testOrderC) ... skipped 'cupy'
testOrderFortran (test_msgspec.TestMessageSimpleCuPy.testOrderFortran) ... skipped 'cupy'
testArray1 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackCPUBuf.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackCPUBuf.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackCPUBufV0.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackGPUBuf.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackGPUBuf.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testBuffer) ... ok
testArray1 (test_msgspec.TestMessageSimpleNumPy.testArray1) ... ok
testArray2 (test_msgspec.TestMessageSimpleNumPy.testArray2) ... ok
testArray3 (test_msgspec.TestMessageSimpleNumPy.testArray3) ... ok
testArray4 (test_msgspec.TestMessageSimpleNumPy.testArray4) ... ok
testArray5 (test_msgspec.TestMessageSimpleNumPy.testArray5) ... ok
testArray6 (test_msgspec.TestMessageSimpleNumPy.testArray6) ... ok
testBuffer (test_msgspec.TestMessageSimpleNumPy.testBuffer) ... ok
testByteOrder (test_msgspec.TestMessageSimpleNumPy.testByteOrder) ... ok
testNotContiguous (test_msgspec.TestMessageSimpleNumPy.testNotContiguous) ... ok
testNotWriteable (test_msgspec.TestMessageSimpleNumPy.testNotWriteable) ... ok
testOrderC (test_msgspec.TestMessageSimpleNumPy.testOrderC) ... ok
testOrderFortran (test_msgspec.TestMessageSimpleNumPy.testOrderFortran) ... ok
testReadonly (test_msgspec.TestMessageSimpleNumPy.testReadonly) ... ok
testArray1 (test_msgspec.TestMessageSimpleNumba.testArray1) ... skipped 'numba'
testArray2 (test_msgspec.TestMessageSimpleNumba.testArray2) ... skipped 'numba'
testArray3 (test_msgspec.TestMessageSimpleNumba.testArray3) ... skipped 'numba'
testArray4 (test_msgspec.TestMessageSimpleNumba.testArray4) ... skipped 'numba'
testArray5 (test_msgspec.TestMessageSimpleNumba.testArray5) ... skipped 'numba'
testArray6 (test_msgspec.TestMessageSimpleNumba.testArray6) ... skipped 'numba'
testBuffer (test_msgspec.TestMessageSimpleNumba.testBuffer) ... skipped 'numba'
testNotContiguous (test_msgspec.TestMessageSimpleNumba.testNotContiguous) ... skipped 'numba'
testOrderC (test_msgspec.TestMessageSimpleNumba.testOrderC) ... skipped 'numba'
testOrderFortran (test_msgspec.TestMessageSimpleNumba.testOrderFortran) ... skipped 'numba'
testMessageBad (test_msgspec.TestMessageVector.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageVector.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageVector.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageVector.testMessageBytes) ... ok
testMessageNone (test_msgspec.TestMessageVector.testMessageNone) ... ok
testArray1 (test_msgspec.TestMessageVectorArray.testArray1) ... ok
testArray2 (test_msgspec.TestMessageVectorArray.testArray2) ... ok
testArray3 (test_msgspec.TestMessageVectorArray.testArray3) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBuffer (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testBuffer) ... ok testArray1 (test_msgspec.TestMessageSimpleNumPy.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageSimpleNumPy.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageSimpleNumPy.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorArray.testArray4) ... ok testArray3 (test_msgspec.TestMessageSimpleNumPy.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorArray.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageSimpleNumPy.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageSimpleNumPy.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageSimpleNumPy.testArray5) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testArray5 (test_msgspec.TestMessageSimpleNumPy.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorArray.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray1 (test_msgspec.TestMessageVectorCAIBuf.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorCAIBuf.testArray2) ... ok testArray6 (test_msgspec.TestMessageSimpleNumPy.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageSimpleNumPy.testArray6) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorCAIBuf.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorCAIBuf.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorCAIBuf.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBuffer (test_msgspec.TestMessageSimpleDLPackGPUBufV0.testBuffer) ... ok testArray1 (test_msgspec.TestMessageSimpleNumPy.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageSimpleNumPy.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageSimpleNumPy.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBuffer (test_msgspec.TestMessageSimpleNumPy.testBuffer) ... ok testByteOrder (test_msgspec.TestMessageSimpleNumPy.testByteOrder) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testNotContiguous (test_msgspec.TestMessageSimpleNumPy.testNotContiguous) ... ok testNotWriteable (test_msgspec.TestMessageSimpleNumPy.testNotWriteable) ... ok testOrderC (test_msgspec.TestMessageSimpleNumPy.testOrderC) ... ok testOrderFortran (test_msgspec.TestMessageSimpleNumPy.testOrderFortran) ... ok testReadonly (test_msgspec.TestMessageSimpleNumPy.testReadonly) ... ok testArray1 (test_msgspec.TestMessageSimpleNumba.testArray1) ... skipped 'numba' testArray2 (test_msgspec.TestMessageSimpleNumba.testArray2) ... skipped 'numba' testArray3 (test_msgspec.TestMessageSimpleNumba.testArray3) ... skipped 'numba' testArray4 (test_msgspec.TestMessageSimpleNumba.testArray4) ... skipped 'numba' testArray5 (test_msgspec.TestMessageSimpleNumba.testArray5) ... skipped 'numba' testArray6 (test_msgspec.TestMessageSimpleNumba.testArray6) ... skipped 'numba' testBuffer (test_msgspec.TestMessageSimpleNumba.testBuffer) ... 
skipped 'numba' testNotContiguous (test_msgspec.TestMessageSimpleNumba.testNotContiguous) ... skipped 'numba' testOrderC (test_msgspec.TestMessageSimpleNumba.testOrderC) ... skipped 'numba' testOrderFortran (test_msgspec.TestMessageSimpleNumba.testOrderFortran) ... skipped 'numba' testMessageBad (test_msgspec.TestMessageVector.testMessageBad) ... ok testMessageBottom (test_msgspec.TestMessageVector.testMessageBottom) ... ok testMessageBytearray (test_msgspec.TestMessageVector.testMessageBytearray) ... ok testMessageBytes (test_msgspec.TestMessageVector.testMessageBytes) ... ok testMessageNone (test_msgspec.TestMessageVector.testMessageNone) ... ok testArray1 (test_msgspec.TestMessageVectorArray.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorArray.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBuffer (test_msgspec.TestMessageSimpleNumPy.testBuffer) ... ok testByteOrder (test_msgspec.TestMessageSimpleNumPy.testByteOrder) ... ok testNotContiguous (test_msgspec.TestMessageSimpleNumPy.testNotContiguous) ... ok testNotWriteable (test_msgspec.TestMessageSimpleNumPy.testNotWriteable) ... ok testOrderC (test_msgspec.TestMessageSimpleNumPy.testOrderC) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorArray.testArray3) ... testOrderFortran (test_msgspec.TestMessageSimpleNumPy.testOrderFortran) ... ok testReadonly (test_msgspec.TestMessageSimpleNumPy.testReadonly) ... ok testArray1 (test_msgspec.TestMessageSimpleNumba.testArray1) ... skipped 'numba' testArray2 (test_msgspec.TestMessageSimpleNumba.testArray2) ... skipped 'numba' testArray3 (test_msgspec.TestMessageSimpleNumba.testArray3) ... skipped 'numba' testArray4 (test_msgspec.TestMessageSimpleNumba.testArray4) ... skipped 'numba' testArray5 (test_msgspec.TestMessageSimpleNumba.testArray5) ... 
skipped 'numba' testArray6 (test_msgspec.TestMessageSimpleNumba.testArray6) ... skipped 'numba' testBuffer (test_msgspec.TestMessageSimpleNumba.testBuffer) ... skipped 'numba' testNotContiguous (test_msgspec.TestMessageSimpleNumba.testNotContiguous) ... skipped 'numba' testOrderC (test_msgspec.TestMessageSimpleNumba.testOrderC) ... skipped 'numba' testOrderFortran (test_msgspec.TestMessageSimpleNumba.testOrderFortran) ... skipped 'numba' testMessageBad (test_msgspec.TestMessageVector.testMessageBad) ... ok testMessageBottom (test_msgspec.TestMessageVector.testMessageBottom) ... ok testMessageBytearray (test_msgspec.TestMessageVector.testMessageBytearray) ... ok testMessageBytes (test_msgspec.TestMessageVector.testMessageBytes) ... ok testMessageNone (test_msgspec.TestMessageVector.testMessageNone) ... ok testArray1 (test_msgspec.TestMessageVectorArray.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorArray.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorArray.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorArray.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorArray.testArray5) ... ok testArray4 (test_msgspec.TestMessageVectorArray.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorArray.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageSimpleNumPy.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageSimpleNumPy.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorArray.testArray6) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorArray.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorCAIBuf.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray1 (test_msgspec.TestMessageVectorCAIBuf.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorCAIBuf.testArray2) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testArray1 (test_msgspec.TestMessageVectorCAIBuf.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorCAIBuf.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorCAIBuf.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorCAIBuf.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageSimpleNumPy.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorCAIBuf.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorCAIBuf.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorCAIBuf.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorCAIBuf.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray1 (test_msgspec.TestMessageVectorCuPy.testArray1) ... skipped 'cupy' testArray2 (test_msgspec.TestMessageVectorCuPy.testArray2) ... skipped 'cupy' testArray3 (test_msgspec.TestMessageVectorCuPy.testArray3) ... 
skipped 'cupy' testArray4 (test_msgspec.TestMessageVectorCuPy.testArray4) ... skipped 'cupy' testArray5 (test_msgspec.TestMessageVectorCuPy.testArray5) ... skipped 'cupy' testArray6 (test_msgspec.TestMessageVectorCuPy.testArray6) ... skipped 'cupy' testArray1 (test_msgspec.TestMessageVectorNumPy.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorNumPy.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorNumPy.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorNumPy.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBuffer (test_msgspec.TestMessageSimpleNumPy.testBuffer) ... ok testByteOrder (test_msgspec.TestMessageSimpleNumPy.testByteOrder) ... ok testNotContiguous (test_msgspec.TestMessageSimpleNumPy.testNotContiguous) ... ok testNotWriteable (test_msgspec.TestMessageSimpleNumPy.testNotWriteable) ... ok testOrderC (test_msgspec.TestMessageSimpleNumPy.testOrderC) ... ok testOrderFortran (test_msgspec.TestMessageSimpleNumPy.testOrderFortran) ... ok testReadonly (test_msgspec.TestMessageSimpleNumPy.testReadonly) ... ok testArray1 (test_msgspec.TestMessageSimpleNumba.testArray1) ... skipped 'numba' testArray2 (test_msgspec.TestMessageSimpleNumba.testArray2) ... skipped 'numba' testArray3 (test_msgspec.TestMessageSimpleNumba.testArray3) ... skipped 'numba' testArray4 (test_msgspec.TestMessageSimpleNumba.testArray4) ... skipped 'numba' testArray5 (test_msgspec.TestMessageSimpleNumba.testArray5) ... skipped 'numba' testArray6 (test_msgspec.TestMessageSimpleNumba.testArray6) ... skipped 'numba' [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testBuffer (test_msgspec.TestMessageSimpleNumba.testBuffer) ... skipped 'numba' testNotContiguous (test_msgspec.TestMessageSimpleNumba.testNotContiguous) ... 
skipped 'numba' testOrderC (test_msgspec.TestMessageSimpleNumba.testOrderC) ... skipped 'numba' testOrderFortran (test_msgspec.TestMessageSimpleNumba.testOrderFortran) ... skipped 'numba' testMessageBad (test_msgspec.TestMessageVector.testMessageBad) ... ok testMessageBottom (test_msgspec.TestMessageVector.testMessageBottom) ... ok testMessageBytearray (test_msgspec.TestMessageVector.testMessageBytearray) ... ok testMessageBytes (test_msgspec.TestMessageVector.testMessageBytes) ... ok testMessageNone (test_msgspec.TestMessageVector.testMessageNone) ... ok testArray1 (test_msgspec.TestMessageVectorArray.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorNumPy.testArray5) ... ok testArray2 (test_msgspec.TestMessageVectorArray.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorArray.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorArray.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorArray.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorCAIBuf.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorCAIBuf.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorNumPy.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorArray.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray1 (test_msgspec.TestMessageVectorCAIBuf.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorCAIBuf.testArray2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray1 (test_msgspec.TestMessageVectorCuPy.testArray1) ... skipped 'cupy' testArray2 (test_msgspec.TestMessageVectorCuPy.testArray2) ... skipped 'cupy' testArray3 (test_msgspec.TestMessageVectorCuPy.testArray3) ... skipped 'cupy' testArray4 (test_msgspec.TestMessageVectorCuPy.testArray4) ... skipped 'cupy' testArray5 (test_msgspec.TestMessageVectorCuPy.testArray5) ... skipped 'cupy' testArray6 (test_msgspec.TestMessageVectorCuPy.testArray6) ... skipped 'cupy' testArray1 (test_msgspec.TestMessageVectorNumPy.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorNumPy.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorCAIBuf.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorNumPy.testArray3) ... ok testArray1 (test_msgspec.TestMessageVectorCuPy.testArray1) ... skipped 'cupy' testArray2 (test_msgspec.TestMessageVectorCuPy.testArray2) ... skipped 'cupy' testArray3 (test_msgspec.TestMessageVectorCuPy.testArray3) ... skipped 'cupy' testArray4 (test_msgspec.TestMessageVectorCuPy.testArray4) ... skipped 'cupy' testArray5 (test_msgspec.TestMessageVectorCuPy.testArray5) ... skipped 'cupy' testArray6 (test_msgspec.TestMessageVectorCuPy.testArray6) ... skipped 'cupy' testArray1 (test_msgspec.TestMessageVectorNumPy.testArray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray2 (test_msgspec.TestMessageVectorNumPy.testArray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray3 (test_msgspec.TestMessageVectorNumPy.testArray3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCountNumPyArray (test_msgspec.TestMessageVectorNumPy.testCountNumPyArray) ... 
ok testCountNumPyScalar (test_msgspec.TestMessageVectorNumPy.testCountNumPyScalar) ... ok testCountNumPyZeroDim (test_msgspec.TestMessageVectorNumPy.testCountNumPyZeroDim) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testArray1 (test_msgspec.TestMessageVectorNumba.testArray1) ... skipped 'numba' testArray2 (test_msgspec.TestMessageVectorNumba.testArray2) ... skipped 'numba' testArray3 (test_msgspec.TestMessageVectorNumba.testArray3) ... skipped 'numba' testArray4 (test_msgspec.TestMessageVectorNumba.testArray4) ... skipped 'numba' testArray5 (test_msgspec.TestMessageVectorNumba.testArray5) ... skipped 'numba' testArray6 (test_msgspec.TestMessageVectorNumba.testArray6) ... skipped 'numba' testMessageArray (test_msgspec.TestMessageVectorW.testMessageArray) ... ok testMessageBad (test_msgspec.TestMessageVectorW.testMessageBad) ... ok testMessageBottom (test_msgspec.TestMessageVectorW.testMessageBottom) ... ok testMessageBytearray (test_msgspec.TestMessageVectorW.testMessageBytearray) ... ok testMessageBytes (test_msgspec.TestMessageVectorW.testMessageBytes) ... ok testMessageCAIBuf (test_msgspec.TestMessageVectorW.testMessageCAIBuf) ... ok testMessageCuPy (test_msgspec.TestMessageVectorW.testMessageCuPy) ... skipped 'cupy' testMessageNumPy (test_msgspec.TestMessageVectorW.testMessageNumPy) ... ok testMessageNumba (test_msgspec.TestMessageVectorW.testMessageNumba) ... skipped 'numba' testCollectivesBlock (test_msgzero.TestMessageZeroSelf.testCollectivesBlock) ... ok testCollectivesVector (test_msgzero.TestMessageZeroSelf.testCollectivesVector) ... ok testPointToPoint (test_msgzero.TestMessageZeroSelf.testPointToPoint) ... ok testReductions (test_msgzero.TestMessageZeroSelf.testReductions) ... ok testCollectivesBlock (test_msgzero.TestMessageZeroWorld.testCollectivesBlock) ... ok testCollectivesVector (test_msgzero.TestMessageZeroWorld.testCollectivesVector) ... ok testPointToPoint (test_msgzero.TestMessageZeroWorld.testPointToPoint) ... 
ok testReductions (test_msgzero.TestMessageZeroWorld.testReductions) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorNumPy.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorNumPy.testArray5) ... testArray4 (test_msgspec.TestMessageVectorNumPy.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorNumPy.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray4 (test_msgspec.TestMessageVectorCAIBuf.testArray4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray5 (test_msgspec.TestMessageVectorCAIBuf.testArray5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorNumPy.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorNumPy.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArray6 (test_msgspec.TestMessageVectorCAIBuf.testArray6) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCountNumPyArray (test_msgspec.TestMessageVectorNumPy.testCountNumPyArray) ... ok testCountNumPyScalar (test_msgspec.TestMessageVectorNumPy.testCountNumPyScalar) ... ok testCountNumPyZeroDim (test_msgspec.TestMessageVectorNumPy.testCountNumPyZeroDim) ... ok testArray1 (test_msgspec.TestMessageVectorNumba.testArray1) ... skipped 'numba' testArray2 (test_msgspec.TestMessageVectorNumba.testArray2) ... skipped 'numba' testArray3 (test_msgspec.TestMessageVectorNumba.testArray3) ... skipped 'numba' testArray4 (test_msgspec.TestMessageVectorNumba.testArray4) ... skipped 'numba' testArray5 (test_msgspec.TestMessageVectorNumba.testArray5) ... 
skipped 'numba' testArray6 (test_msgspec.TestMessageVectorNumba.testArray6) ... skipped 'numba' testMessageArray (test_msgspec.TestMessageVectorW.testMessageArray) ... ok testMessageBad (test_msgspec.TestMessageVectorW.testMessageBad) ... ok testMessageBottom (test_msgspec.TestMessageVectorW.testMessageBottom) ... ok testMessageBytearray (test_msgspec.TestMessageVectorW.testMessageBytearray) ... ok testMessageBytes (test_msgspec.TestMessageVectorW.testMessageBytes) ... ok testMessageCAIBuf (test_msgspec.TestMessageVectorW.testMessageCAIBuf) ... ok testMessageCuPy (test_msgspec.TestMessageVectorW.testMessageCuPy) ... skipped 'cupy' testMessageNumPy (test_msgspec.TestMessageVectorW.testMessageNumPy) ... ok testMessageNumba (test_msgspec.TestMessageVectorW.testMessageNumba) ... skipped 'numba' testCollectivesBlock (test_msgzero.TestMessageZeroSelf.testCollectivesBlock) ... ok testCollectivesVector (test_msgzero.TestMessageZeroSelf.testCollectivesVector) ... ok testPointToPoint (test_msgzero.TestMessageZeroSelf.testPointToPoint) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testReductions (test_msgzero.TestMessageZeroSelf.testReductions) ... ok testCollectivesBlock (test_msgzero.TestMessageZeroWorld.testCollectivesBlock) ... ok testCollectivesVector (test_msgzero.TestMessageZeroWorld.testCollectivesVector) ... ok testPointToPoint (test_msgzero.TestMessageZeroWorld.testPointToPoint) ... ok testReductions (test_msgzero.TestMessageZeroWorld.testReductions) ... ok testAHandleOf (test_objmodel.TestObjModel.testAHandleOf) ... ok testAddressOf (test_objmodel.TestObjModel.testAddressOf) ... ok testBool (test_objmodel.TestObjModel.testBool) ... ok testCAPI (test_objmodel.TestObjModel.testCAPI) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCmp (test_objmodel.TestObjModel.testCmp) ... ok testConstants (test_objmodel.TestObjModel.testConstants) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testEq (test_objmodel.TestObjModel.testEq) ... ok testHandle (test_objmodel.TestObjModel.testHandle) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHash (test_objmodel.TestObjModel.testHash) ... ok testInit (test_objmodel.TestObjModel.testInit) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNe (test_objmodel.TestObjModel.testNe) ... ok testReduce (test_objmodel.TestObjModel.testReduce) ... ok testCountNumPyArray (test_msgspec.TestMessageVectorNumPy.testCountNumPyArray) ... ok testCountNumPyScalar (test_msgspec.TestMessageVectorNumPy.testCountNumPyScalar) ... ok testCountNumPyZeroDim (test_msgspec.TestMessageVectorNumPy.testCountNumPyZeroDim) ... ok testArray1 (test_msgspec.TestMessageVectorNumba.testArray1) ... skipped 'numba' testArray2 (test_msgspec.TestMessageVectorNumba.testArray2) ... skipped 'numba' testArray3 (test_msgspec.TestMessageVectorNumba.testArray3) ... skipped 'numba' testArray4 (test_msgspec.TestMessageVectorNumba.testArray4) ... skipped 'numba' testArray5 (test_msgspec.TestMessageVectorNumba.testArray5) ... skipped 'numba' testArray6 (test_msgspec.TestMessageVectorNumba.testArray6) ... skipped 'numba' testMessageArray (test_msgspec.TestMessageVectorW.testMessageArray) ... ok testMessageBad (test_msgspec.TestMessageVectorW.testMessageBad) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testMessageBottom (test_msgspec.TestMessageVectorW.testMessageBottom) ... ok testMessageBytearray (test_msgspec.TestMessageVectorW.testMessageBytearray) ... ok testMessageBytes (test_msgspec.TestMessageVectorW.testMessageBytes) ... ok testMessageCAIBuf (test_msgspec.TestMessageVectorW.testMessageCAIBuf) ... ok testMessageCuPy (test_msgspec.TestMessageVectorW.testMessageCuPy) ... skipped 'cupy' testMessageNumPy (test_msgspec.TestMessageVectorW.testMessageNumPy) ... 
ok testMessageNumba (test_msgspec.TestMessageVectorW.testMessageNumba) ... skipped 'numba' testCollectivesBlock (test_msgzero.TestMessageZeroSelf.testCollectivesBlock) ... ok testCollectivesVector (test_msgzero.TestMessageZeroSelf.testCollectivesVector) ... ok testPointToPoint (test_msgzero.TestMessageZeroSelf.testPointToPoint) ... ok testReductions (test_msgzero.TestMessageZeroSelf.testReductions) ... ok testCollectivesBlock (test_msgzero.TestMessageZeroWorld.testCollectivesBlock) ... ok testCollectivesVector (test_msgzero.TestMessageZeroWorld.testCollectivesVector) ... ok testPointToPoint (test_msgzero.TestMessageZeroWorld.testPointToPoint) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testReductions (test_msgzero.TestMessageZeroWorld.testReductions) ... ok testAHandleOf (test_objmodel.TestObjModel.testAHandleOf) ... ok testAddressOf (test_objmodel.TestObjModel.testAddressOf) ... ok testBool (test_objmodel.TestObjModel.testBool) ... ok testCAPI (test_objmodel.TestObjModel.testCAPI) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCmp (test_objmodel.TestObjModel.testCmp) ... ok testConstants (test_objmodel.TestObjModel.testConstants) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testEq (test_objmodel.TestObjModel.testEq) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHandle (test_objmodel.TestObjModel.testHandle) ... ok testHash (test_objmodel.TestObjModel.testHash) ... ok testInit (test_objmodel.TestObjModel.testInit) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNe (test_objmodel.TestObjModel.testNe) ... ok testReduce (test_objmodel.TestObjModel.testReduce) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSafeFreeConstant (test_objmodel.TestObjModel.testSafeFreeConstant) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSafeFreeConstant (test_objmodel.TestObjModel.testSafeFreeConstant) ... 
testArray1 (test_msgspec.TestMessageVectorCuPy.testArray1) ... skipped 'cupy'
testArray2 (test_msgspec.TestMessageVectorCuPy.testArray2) ... skipped 'cupy'
testArray3 (test_msgspec.TestMessageVectorCuPy.testArray3) ... skipped 'cupy'
testArray4 (test_msgspec.TestMessageVectorCuPy.testArray4) ... skipped 'cupy'
testArray5 (test_msgspec.TestMessageVectorCuPy.testArray5) ... skipped 'cupy'
testArray6 (test_msgspec.TestMessageVectorCuPy.testArray6) ... skipped 'cupy'
testArray1 (test_msgspec.TestMessageVectorNumPy.testArray1) ... ok
testArray2 (test_msgspec.TestMessageVectorNumPy.testArray2) ... ok
testArray3 (test_msgspec.TestMessageVectorNumPy.testArray3) ... ok
testArray4 (test_msgspec.TestMessageVectorNumPy.testArray4) ... ok
testArray5 (test_msgspec.TestMessageVectorNumPy.testArray5) ... ok
testArray6 (test_msgspec.TestMessageVectorNumPy.testArray6) ... ok
testCountNumPyArray (test_msgspec.TestMessageVectorNumPy.testCountNumPyArray) ... ok
testCountNumPyScalar (test_msgspec.TestMessageVectorNumPy.testCountNumPyScalar) ... ok
testCountNumPyZeroDim (test_msgspec.TestMessageVectorNumPy.testCountNumPyZeroDim) ... ok
testArray1 (test_msgspec.TestMessageVectorNumba.testArray1) ... skipped 'numba'
testArray2 (test_msgspec.TestMessageVectorNumba.testArray2) ... skipped 'numba'
testArray3 (test_msgspec.TestMessageVectorNumba.testArray3) ... skipped 'numba'
testArray4 (test_msgspec.TestMessageVectorNumba.testArray4) ... skipped 'numba'
testArray5 (test_msgspec.TestMessageVectorNumba.testArray5) ... skipped 'numba'
testArray6 (test_msgspec.TestMessageVectorNumba.testArray6) ... skipped 'numba'
testMessageArray (test_msgspec.TestMessageVectorW.testMessageArray) ... ok
testMessageBad (test_msgspec.TestMessageVectorW.testMessageBad) ... ok
testMessageBottom (test_msgspec.TestMessageVectorW.testMessageBottom) ... ok
testMessageBytearray (test_msgspec.TestMessageVectorW.testMessageBytearray) ... ok
testMessageBytes (test_msgspec.TestMessageVectorW.testMessageBytes) ... ok
testMessageCAIBuf (test_msgspec.TestMessageVectorW.testMessageCAIBuf) ... ok
testMessageCuPy (test_msgspec.TestMessageVectorW.testMessageCuPy) ... skipped 'cupy'
testMessageNumPy (test_msgspec.TestMessageVectorW.testMessageNumPy) ... ok
testMessageNumba (test_msgspec.TestMessageVectorW.testMessageNumba) ... skipped 'numba'
testCollectivesBlock (test_msgzero.TestMessageZeroSelf.testCollectivesBlock) ... ok
testCollectivesVector (test_msgzero.TestMessageZeroSelf.testCollectivesVector) ... ok
testPointToPoint (test_msgzero.TestMessageZeroSelf.testPointToPoint) ... ok
testReductions (test_msgzero.TestMessageZeroSelf.testReductions) ... ok
testCollectivesBlock (test_msgzero.TestMessageZeroWorld.testCollectivesBlock) ... ok
testCollectivesVector (test_msgzero.TestMessageZeroWorld.testCollectivesVector) ... ok
testPointToPoint (test_msgzero.TestMessageZeroWorld.testPointToPoint) ... ok
testReductions (test_msgzero.TestMessageZeroWorld.testReductions) ... ok
testAHandleOf (test_objmodel.TestObjModel.testAHandleOf) ... ok
testAddressOf (test_objmodel.TestObjModel.testAddressOf) ... ok
testBool (test_objmodel.TestObjModel.testBool) ... ok
testCAPI (test_objmodel.TestObjModel.testCAPI) ... ok
testCmp (test_objmodel.TestObjModel.testCmp) ... ok
testConstants (test_objmodel.TestObjModel.testConstants) ... ok
testEq (test_objmodel.TestObjModel.testEq) ... ok
testHandle (test_objmodel.TestObjModel.testHandle) ... ok
testHash (test_objmodel.TestObjModel.testHash) ... ok
testInit (test_objmodel.TestObjModel.testInit) ... ok
testNe (test_objmodel.TestObjModel.testNe) ... ok
testReduce (test_objmodel.TestObjModel.testReduce) ... ok
testSafeFreeConstant (test_objmodel.TestObjModel.testSafeFreeConstant) ... ok
testSafeFreeCreated (test_objmodel.TestObjModel.testSafeFreeCreated) ... ok
testSafeFreeNull (test_objmodel.TestObjModel.testSafeFreeNull) ... ok
testSizeOf (test_objmodel.TestObjModel.testSizeOf) ... ok
testWeakRef (test_objmodel.TestObjModel.testWeakRef) ... ok
testCall (test_op.TestOp.testCall) ... ok
testConstructor (test_op.TestOp.testConstructor) ... ok
testCreate (test_op.TestOp.testCreate) ... ok
testCreateMany (test_op.TestOp.testCreateMany) ... ok
testIsCommutative (test_op.TestOp.testIsCommutative) ... ok
testIsCommutativeExtra (test_op.TestOp.testIsCommutativeExtra) ... ok
testIsPredefined (test_op.TestOp.testIsPredefined) ... ok
testMinMax (test_op.TestOp.testMinMax) ... ok
testMinMaxLoc (test_op.TestOp.testMinMaxLoc) ... ok
testPicklePredefined (test_op.TestOp.testPicklePredefined) ... ok
testPickleUserDefined (test_op.TestOp.testPickleUserDefined) ... ok
testConstructor (test_p2p_buf.TestP2PBufSelf.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufSelf.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufSelf.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufSelf.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufSelf.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufSelf.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufSelf.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufSelf.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufSelf.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufSelf.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufSelf.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufSelf.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufSelf.testSendrecvReplace) ... ok
testConstructor (test_p2p_buf.TestP2PBufSelfDup.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufSelfDup.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufSelfDup.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufSelfDup.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufSelfDup.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufSelfDup.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufSelfDup.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufSelfDup.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufSelfDup.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufSelfDup.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufSelfDup.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufSelfDup.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufSelfDup.testSendrecvReplace) ... ok
testConstructor (test_p2p_buf.TestP2PBufWorld.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufWorld.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufWorld.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufWorld.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufWorld.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufWorld.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufWorld.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufWorld.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufWorld.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufWorld.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufWorld.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufWorld.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufWorld.testSendrecvReplace) ... ok
testConstructor (test_p2p_buf.TestP2PBufWorldDup.testConstructor) ... ok
testIProbe (test_p2p_buf.TestP2PBufWorldDup.testIProbe) ... ok
testISendrecv (test_p2p_buf.TestP2PBufWorldDup.testISendrecv) ... ok
testISendrecvReplace (test_p2p_buf.TestP2PBufWorldDup.testISendrecvReplace) ... ok
testPersistent (test_p2p_buf.TestP2PBufWorldDup.testPersistent) ... ok
testProbe (test_p2p_buf.TestP2PBufWorldDup.testProbe) ... ok
testProbeCancel (test_p2p_buf.TestP2PBufWorldDup.testProbeCancel) ... ok
testProcNull (test_p2p_buf.TestP2PBufWorldDup.testProcNull) ... ok
testProcNullISendrecv (test_p2p_buf.TestP2PBufWorldDup.testProcNullISendrecv) ... ok
testProcNullPersistent (test_p2p_buf.TestP2PBufWorldDup.testProcNullPersistent) ... ok
testSendRecv (test_p2p_buf.TestP2PBufWorldDup.testSendRecv) ... ok
testSendrecv (test_p2p_buf.TestP2PBufWorldDup.testSendrecv) ... ok
testSendrecvReplace (test_p2p_buf.TestP2PBufWorldDup.testSendrecvReplace) ... ok
testMessageNoProc (test_p2p_buf_matched.TestMessage.testMessageNoProc) ... ok
testMessageNull (test_p2p_buf_matched.TestMessage.testMessageNull) ... ok
testPickle (test_p2p_buf_matched.TestMessage.testPickle) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedSelf.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedSelf.testProbeRecv) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedSelfDup.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedSelfDup.testProbeRecv) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedWorld.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedWorld.testProbeRecv) ... ok
testIMProbe (test_p2p_buf_matched.TestP2PMatchedWorldDup.testIMProbe) ... ok
testProbeRecv (test_p2p_buf_matched.TestP2PMatchedWorldDup.testProbeRecv) ... ok
testRing (test_p2p_buf_part.TestP2PBufPartSelf.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelf.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartSelf.testSelf) ... skipped 'mpi-p2p-part'
testRing (test_p2p_buf_part.TestP2PBufPartSelfDup.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelfDup.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartSelfDup.testSelf) ... skipped 'mpi-p2p-part'
testRing (test_p2p_buf_part.TestP2PBufPartWorld.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorld.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartWorld.testSelf) ... skipped 'mpi-p2p-part'
testRing (test_p2p_buf_part.TestP2PBufPartWorldDup.testRing) ... skipped 'mpi-p2p-part'
testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorldDup.testRingRangeList) ... skipped 'mpi-p2p-part'
testSelf (test_p2p_buf_part.TestP2PBufPartWorldDup.testSelf) ... skipped 'mpi-p2p-part'
testCancel (test_p2p_obj.TestP2PObjSelf.testCancel) ... ok
testCommLock (test_p2p_obj.TestP2PObjSelf.testCommLock) ... ok
testIRecvAndBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndBSend) ... ok
testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndIBSend) ... ok
testIRecvAndISSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISSend) ... ok
testIRecvAndISend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISend) ... ok
testIRecvAndSSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSSend) ... ok
testIRecvAndSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSend) ... ok
testISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testISSendAndRecv) ... ok
testISendAndRecv (test_p2p_obj.TestP2PObjSelf.testISendAndRecv) ... ok
testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISSendAndRecv) ... ok
testManyISendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISendAndRecv) ... ok
testMixed (test_p2p_obj.TestP2PObjSelf.testMixed) ... ok
testPingPong01 (test_p2p_obj.TestP2PObjSelf.testPingPong01) ... ok
testProbe (test_p2p_obj.TestP2PObjSelf.testProbe) ... ok
testRecvObjArg (test_p2p_obj.TestP2PObjSelf.testRecvObjArg) ... ok
testSSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSSendAndRecv) ... ok
testSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSendAndRecv) ...
ok testSendrecv (test_p2p_obj.TestP2PObjSelf.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelf.testTestSomeRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testISSendAndRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelf.testTestSomeSend) ... ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelf.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelf.testWaitSomeSend) ... ok testCancel (test_p2p_obj.TestP2PObjSelfDup.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjSelfDup.testCommLock) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndBSend) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjSelf.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndIBSend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndISend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISend) ... ok testManyISendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndSSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjSelf.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjSelf.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjSelf.testProbe) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISSendAndRecv) ... ok testRecvObjArg (test_p2p_obj.TestP2PObjSelf.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjSelf.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelf.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelf.testTestSomeSend) ... ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelf.testWaitSomeRecv) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelf.testWaitSomeSend) ... ok testCancel (test_p2p_obj.TestP2PObjSelfDup.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjSelfDup.testCommLock) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndBSend) ... ok testManyISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISendAndRecv) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndIBSend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndISend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjSelfDup.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjSelfDup.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjSelfDup.testProbe) ... 
ok testRecvObjArg (test_p2p_obj.TestP2PObjSelfDup.testRecvObjArg) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjSelfDup.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelfDup.testTestSomeSend) ... ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeSend) ... ok testCancel (test_p2p_obj.TestP2PObjWorld.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjWorld.testCommLock) ... ok testIRecvAndSSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISSendAndRecv) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testManyISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjSelfDup.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjSelfDup.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjSelfDup.testProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testRecvObjArg (test_p2p_obj.TestP2PObjSelfDup.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSendAndRecv) ... 
ok testSendrecv (test_p2p_obj.TestP2PObjSelfDup.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelfDup.testTestSomeSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeSend) ... ok testCancel (test_p2p_obj.TestP2PObjWorld.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjWorld.testCommLock) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testRing (test_p2p_buf_part.TestP2PBufPartSelf.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelf.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartSelf.testSelf) ... skipped 'mpi-p2p-part' testRing (test_p2p_buf_part.TestP2PBufPartSelfDup.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelfDup.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartSelfDup.testSelf) ... skipped 'mpi-p2p-part' testRing (test_p2p_buf_part.TestP2PBufPartWorld.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorld.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartWorld.testSelf) ... skipped 'mpi-p2p-part' testRing (test_p2p_buf_part.TestP2PBufPartWorldDup.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorldDup.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartWorldDup.testSelf) ... skipped 'mpi-p2p-part' testCancel (test_p2p_obj.TestP2PObjSelf.testCancel) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testRing (test_p2p_buf_part.TestP2PBufPartSelf.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelf.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartSelf.testSelf) ... skipped 'mpi-p2p-part' testRing (test_p2p_buf_part.TestP2PBufPartSelfDup.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartSelfDup.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartSelfDup.testSelf) ... skipped 'mpi-p2p-part' testRing (test_p2p_buf_part.TestP2PBufPartWorld.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorld.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartWorld.testSelf) ... skipped 'mpi-p2p-part' testRing (test_p2p_buf_part.TestP2PBufPartWorldDup.testRing) ... skipped 'mpi-p2p-part' testRingRangeList (test_p2p_buf_part.TestP2PBufPartWorldDup.testRingRangeList) ... skipped 'mpi-p2p-part' testSelf (test_p2p_buf_part.TestP2PBufPartWorldDup.testSelf) ... skipped 'mpi-p2p-part' testCancel (test_p2p_obj.TestP2PObjSelf.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjSelf.testCommLock) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndBSend) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndIBSend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISSend) ... ok testCommLock (test_p2p_obj.TestP2PObjSelf.testCommLock) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndBSend) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndIBSend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISSend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjSelf.testIRecvAndISend) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndSSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSSend) ... ok testIRecvAndSSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndSend (test_p2p_obj.TestP2PObjSelf.testIRecvAndSend) ... ok testISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISendAndRecv (test_p2p_obj.TestP2PObjSelf.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISSendAndRecv) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjSelf.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testManyISendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISendAndRecv) ... ok testManyISendAndRecv (test_p2p_obj.TestP2PObjSelf.testManyISendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjSelf.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjSelf.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjSelf.testProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjSelf.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjSelf.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjSelf.testProbe) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testRecvObjArg (test_p2p_obj.TestP2PObjSelf.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjSelf.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelf.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelf.testTestSomeSend) ... ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelf.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelf.testWaitSomeSend) ... ok ok testRecvObjArg (test_p2p_obj.TestP2PObjSelf.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjSelf.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjSelf.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelf.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelf.testTestSomeSend) ... ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelf.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelf.testWaitSomeSend) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCancel (test_p2p_obj.TestP2PObjSelfDup.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjSelfDup.testCommLock) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndBSend) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndIBSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCancel (test_p2p_obj.TestP2PObjSelfDup.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjSelfDup.testCommLock) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndBSend) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndIBSend) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndISSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISSend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISSend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndISend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndSSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSend) ... ok testIRecvAndSSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjSelfDup.testIRecvAndSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISSendAndRecv) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISSendAndRecv) ... ok testISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISSendAndRecv) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testManyISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISendAndRecv) ... ok testManyISendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testManyISendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjSelfDup.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjSelfDup.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjSelfDup.testProbe) ... 
ok testRecvObjArg (test_p2p_obj.TestP2PObjSelfDup.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjSelfDup.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelfDup.testTestSomeSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjSelfDup.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjSelfDup.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjSelfDup.testProbe) ... ok testRecvObjArg (test_p2p_obj.TestP2PObjSelfDup.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjSelfDup.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjSelfDup.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testTestSomeRecv) ... ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeSend) ... ok testCancel (test_p2p_obj.TestP2PObjWorld.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjWorld.testCommLock) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndBSend) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndIBSend) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjSelfDup.testTestSomeSend) ... ok testWaitSomeRecv (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjSelfDup.testWaitSomeSend) ... ok testCancel (test_p2p_obj.TestP2PObjWorld.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjWorld.testCommLock) ... 
ok testIRecvAndBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndBSend) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndBSend) ... ok testIRecvAndBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndBSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndISSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISSend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndIBSend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISSend) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndIBSend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISSend) ... ok testIRecvAndIBSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndIBSend) ... ok testIRecvAndISSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISSend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISend) ... ok testIRecvAndISend (test_p2p_obj.TestP2PObjWorld.testIRecvAndISend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndSSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSSend) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIRecvAndSSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSSend) ... ok testIRecvAndSSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSSend) ... ok testIRecvAndSSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSend) ... ok testIRecvAndSend (test_p2p_obj.TestP2PObjWorld.testIRecvAndSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISendAndRecv (test_p2p_obj.TestP2PObjWorld.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISSendAndRecv) ... ok testISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testISSendAndRecv) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjWorld.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISSendAndRecv) ... ok testISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testISSendAndRecv) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjWorld.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISSendAndRecv) ... ok testISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testISSendAndRecv) ... ok testISendAndRecv (test_p2p_obj.TestP2PObjWorld.testISendAndRecv) ... ok testManyISSendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISSendAndRecv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testManyISendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testManyISendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISendAndRecv) ... ok testManyISendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISendAndRecv) ... ok testManyISendAndRecv (test_p2p_obj.TestP2PObjWorld.testManyISendAndRecv) ... ok testMixed (test_p2p_obj.TestP2PObjWorld.testMixed) ... ok testMixed (test_p2p_obj.TestP2PObjWorld.testMixed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPingPong01 (test_p2p_obj.TestP2PObjWorld.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjWorld.testProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMixed (test_p2p_obj.TestP2PObjWorld.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjWorld.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjWorld.testProbe) ... ok testPingPong01 (test_p2p_obj.TestP2PObjWorld.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjWorld.testProbe) ... ok testMixed (test_p2p_obj.TestP2PObjWorld.testMixed) ... ok testPingPong01 (test_p2p_obj.TestP2PObjWorld.testPingPong01) ... ok testProbe (test_p2p_obj.TestP2PObjWorld.testProbe) ... ok testRecvObjArg (test_p2p_obj.TestP2PObjWorld.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSSendAndRecv) ... 
ok testSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjWorld.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjWorld.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjWorld.testTestSomeSend) ... ok testRecvObjArg (test_p2p_obj.TestP2PObjWorld.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjWorld.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjWorld.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjWorld.testTestSomeSend) ... ok testRecvObjArg (test_p2p_obj.TestP2PObjWorld.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjWorld.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjWorld.testTestSomeRecv) ... ok ok testRecvObjArg (test_p2p_obj.TestP2PObjWorld.testRecvObjArg) ... ok testSSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSSendAndRecv) ... ok testSendAndRecv (test_p2p_obj.TestP2PObjWorld.testSendAndRecv) ... ok testSendrecv (test_p2p_obj.TestP2PObjWorld.testSendrecv) ... ok testTestSomeRecv (test_p2p_obj.TestP2PObjWorld.testTestSomeRecv) ... ok testTestSomeSend (test_p2p_obj.TestP2PObjWorld.testTestSomeSend) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testWaitSomeRecv (test_p2p_obj.TestP2PObjWorld.testWaitSomeRecv) ... ok testWaitSomeSend (test_p2p_obj.TestP2PObjWorld.testWaitSomeSend) ... ok testCancel (test_p2p_obj.TestP2PObjWorldDup.testCancel) ... ok testCommLock (test_p2p_obj.TestP2PObjWorldDup.testCommLock) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testIRecvAndBSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndBSend) ... ok
testTestSomeSend (test_p2p_obj.TestP2PObjWorld.testTestSomeSend) ... ok
testWaitSomeRecv (test_p2p_obj.TestP2PObjWorld.testWaitSomeRecv) ... ok
testWaitSomeSend (test_p2p_obj.TestP2PObjWorld.testWaitSomeSend) ... ok
testCancel (test_p2p_obj.TestP2PObjWorldDup.testCancel) ... ok
testCommLock (test_p2p_obj.TestP2PObjWorldDup.testCommLock) ... ok
testIRecvAndIBSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndIBSend) ... ok
testIRecvAndISSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndISSend) ... ok
testIRecvAndISend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndISend) ... ok
testIRecvAndSSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndSSend) ... ok
testIRecvAndSend (test_p2p_obj.TestP2PObjWorldDup.testIRecvAndSend) ... ok
testISSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testISSendAndRecv) ... ok
testISendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testISendAndRecv) ... ok
testManyISSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testManyISSendAndRecv) ... ok
testManyISendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testManyISendAndRecv) ... ok
testMixed (test_p2p_obj.TestP2PObjWorldDup.testMixed) ... ok
testPingPong01 (test_p2p_obj.TestP2PObjWorldDup.testPingPong01) ... ok
testProbe (test_p2p_obj.TestP2PObjWorldDup.testProbe) ... ok
testRecvObjArg (test_p2p_obj.TestP2PObjWorldDup.testRecvObjArg) ... ok
testSSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testSSendAndRecv) ... ok
testSendAndRecv (test_p2p_obj.TestP2PObjWorldDup.testSendAndRecv) ... ok
testSendrecv (test_p2p_obj.TestP2PObjWorldDup.testSendrecv) ... ok
testTestSomeRecv (test_p2p_obj.TestP2PObjWorldDup.testTestSomeRecv) ... ok
testTestSomeSend (test_p2p_obj.TestP2PObjWorldDup.testTestSomeSend) ... ok
testWaitSomeRecv (test_p2p_obj.TestP2PObjWorldDup.testWaitSomeRecv) ... ok
testWaitSomeSend (test_p2p_obj.TestP2PObjWorldDup.testWaitSomeSend) ... ok
testMessageNoProc (test_p2p_obj_matched.TestMessage.testMessageNoProc) ... ok
testMessageNull (test_p2p_obj_matched.TestMessage.testMessageNull) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedSelf.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedSelf.testProbeRecv) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedSelfDup.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedSelfDup.testProbeRecv) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedWorld.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedWorld.testProbeRecv) ... ok
testIMProbe (test_p2p_obj_matched.TestP2PMatchedWorldDup.testIMProbe) ... ok
testProbeRecv (test_p2p_obj_matched.TestP2PMatchedWorldDup.testProbeRecv) ... ok
testPackSize (test_pack.TestPackExternal.testPackSize) ... ok
testPackUnpackExternal (test_pack.TestPackExternal.testPackUnpackExternal) ... ok
testPackSize (test_pack.TestPackSelf.testPackSize) ... ok
testPackUnpack (test_pack.TestPackSelf.testPackUnpack) ... ok
testPackSize (test_pack.TestPackWorld.testPackSize) ... ok
testPackUnpack (test_pack.TestPackWorld.testPackUnpack) ... ok
testCython (test_package.TestDataFiles.testCython) ... ok
testHeaders (test_package.TestDataFiles.testHeaders) ... ok
testTyping (test_package.TestDataFiles.testTyping) ... ok
testImportBench (test_package.TestImport.testImportBench) ... ok
testImportFutures (test_package.TestImport.testImportFutures) ... ok
testImportMPI (test_package.TestImport.testImportMPI) ... ok
testImportRun (test_package.TestImport.testImportRun) ... ok
testImportTyping (test_package.TestImport.testImportTyping) ... ok
testImportUtil (test_package.TestImport.testImportUtil) ... ok
testDefault (test_pickle.TestPickle.testDefault) ... ok
testDill (test_pickle.TestPickle.testDill) ... skipped 'dill'
testJson (test_pickle.TestPickle.testJson) ... ok
testMarshal (test_pickle.TestPickle.testMarshal) ... ok
testPyPickle (test_pickle.TestPickle.testPyPickle) ... ok
testYAML (test_pickle.TestPickle.testYAML) ... skipped 'yaml'
testGetStatus (test_request.TestRequest.testGetStatus) ... ok
testTest (test_request.TestRequest.testTest) ... ok
testWait (test_request.TestRequest.testWait) ... ok
testGetStatusAll (test_request.TestRequestArray.testGetStatusAll) ... ok
testGetStatusAny (test_request.TestRequestArray.testGetStatusAny) ... ok
testGetStatusSome (test_request.TestRequestArray.testGetStatusSome) ... ok
testTestall (test_request.TestRequestArray.testTestall) ... ok
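The test_pickle results above exercise mpi4py's pluggable serialization backends for object communication: the default pickle, plus marshal and json, with the optional dill and yaml backends skipped here because those modules are not installed in the build chroot. As a minimal stdlib-only sketch (illustrative only, not mpi4py's actual test code) of the round-trip property these tests check:

```python
# Round-trip a Python object through the stdlib serializers that
# mpi4py's test_pickle exercises (dill and yaml omitted: they are
# optional third-party dependencies, skipped in the log above).
import json
import marshal
import pickle

obj = {"rank": 0, "data": [1, 2, 3]}

# pickle: handles arbitrary picklable Python objects.
assert pickle.loads(pickle.dumps(obj)) == obj

# marshal: restricted to a fixed set of built-in types.
assert marshal.loads(marshal.dumps(obj)) == obj

# json: text-based; dict keys must already be strings to round-trip.
assert json.loads(json.dumps(obj)) == obj
```

A serializer passes such a test only when deserializing its own output reproduces an equal object, which is the property an object-based send/recv relies on.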
testTestany (test_request.TestRequestArray.testTestany) ... ok
testTestsome (test_request.TestRequestArray.testTestsome) ... ok
testWaitall (test_request.TestRequestArray.testWaitall) ... ok
testWaitany (test_request.TestRequestArray.testWaitany) ... ok
testWaitsome (test_request.TestRequestArray.testWaitsome) ... ok
testAccumulate (test_rma.TestRMASelf.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma.TestRMASelf.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma.TestRMASelf.testAccumulateProcNullSum) ... ok
testCompareAndSwap (test_rma.TestRMASelf.testCompareAndSwap) ... ok
testFence (test_rma.TestRMASelf.testFence) ... ok
testFenceAll (test_rma.TestRMASelf.testFenceAll) ... ok
testFetchAndOp (test_rma.TestRMASelf.testFetchAndOp) ... ok
testFlush (test_rma.TestRMASelf.testFlush) ... ok
testGetAccumulate (test_rma.TestRMASelf.testGetAccumulate) ... ok
testGetAccumulateProcNull (test_rma.TestRMASelf.testGetAccumulateProcNull) ... ok
testGetProcNull (test_rma.TestRMASelf.testGetProcNull) ... ok
testPostWait (test_rma.TestRMASelf.testPostWait) ... ok
testPutGet (test_rma.TestRMASelf.testPutGet) ... ok
testPutProcNull (test_rma.TestRMASelf.testPutProcNull) ... ok
testStartComplete (test_rma.TestRMASelf.testStartComplete) ... ok
testStartCompletePostTest (test_rma.TestRMASelf.testStartCompletePostTest) ... ok
testStartCompletePostWait (test_rma.TestRMASelf.testStartCompletePostWait) ... ok
testSync (test_rma.TestRMASelf.testSync) ... ok
testAccumulate (test_rma.TestRMAWorld.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma.TestRMAWorld.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma.TestRMAWorld.testAccumulateProcNullSum) ... ok
testCompareAndSwap (test_rma.TestRMAWorld.testCompareAndSwap) ... ok
testFence (test_rma.TestRMAWorld.testFence) ... ok
testFenceAll (test_rma.TestRMAWorld.testFenceAll) ... ok
testFetchAndOp (test_rma.TestRMAWorld.testFetchAndOp) ... ok
testFlush (test_rma.TestRMAWorld.testFlush) ... ok
testGetAccumulate (test_rma.TestRMAWorld.testGetAccumulate) ... ok
testGetAccumulateProcNull (test_rma.TestRMAWorld.testGetAccumulateProcNull) ... ok
testGetProcNull (test_rma.TestRMAWorld.testGetProcNull) ... ok
testPostWait (test_rma.TestRMAWorld.testPostWait) ... ok
testPutGet (test_rma.TestRMAWorld.testPutGet) ... ok
testPutProcNull (test_rma.TestRMAWorld.testPutProcNull) ... ok
testStartComplete (test_rma.TestRMAWorld.testStartComplete) ... ok
testStartCompletePostTest (test_rma.TestRMAWorld.testStartCompletePostTest) ... ok
testStartCompletePostWait (test_rma.TestRMAWorld.testStartCompletePostWait) ... ok
testSync (test_rma.TestRMAWorld.testSync) ... ok
testAccumulate (test_rma_nb.TestRMASelf.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma_nb.TestRMASelf.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma_nb.TestRMASelf.testAccumulateProcNullSum) ... ok
testGetAccumulate (test_rma_nb.TestRMASelf.testGetAccumulate) ... ok
testGetProcNull (test_rma_nb.TestRMASelf.testGetProcNull) ... ok
testPutGet (test_rma_nb.TestRMASelf.testPutGet) ... ok
testPutProcNull (test_rma_nb.TestRMASelf.testPutProcNull) ... ok
testAccumulate (test_rma_nb.TestRMAWorld.testAccumulate) ... ok
testAccumulateProcNullReplace (test_rma_nb.TestRMAWorld.testAccumulateProcNullReplace) ... ok
testAccumulateProcNullSum (test_rma_nb.TestRMAWorld.testAccumulateProcNullSum) ... ok
testGetAccumulate (test_rma_nb.TestRMAWorld.testGetAccumulate) ... ok
ok testAccumulateProcNullSum (test_rma_nb.TestRMAWorld.testAccumulateProcNullSum) ... ok testGetAccumulate (test_rma_nb.TestRMAWorld.testGetAccumulate) ... ok testAccumulateProcNullReplace (test_rma_nb.TestRMAWorld.testAccumulateProcNullReplace) ... ok testAccumulateProcNullSum (test_rma_nb.TestRMAWorld.testAccumulateProcNullSum) ... ok testGetAccumulate (test_rma_nb.TestRMAWorld.testGetAccumulate) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetProcNull (test_rma_nb.TestRMAWorld.testGetProcNull) ... ok testPutGet (test_rma_nb.TestRMAWorld.testPutGet) ... ok testGetProcNull (test_rma_nb.TestRMAWorld.testGetProcNull) ... ok testPutGet (test_rma_nb.TestRMAWorld.testPutGet) ... ok testGetProcNull (test_rma_nb.TestRMAWorld.testGetProcNull) ... ok testPutGet (test_rma_nb.TestRMAWorld.testPutGet) ... ok testGetProcNull (test_rma_nb.TestRMAWorld.testGetProcNull) ... ok testPutGet (test_rma_nb.TestRMAWorld.testPutGet) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPutProcNull (test_rma_nb.TestRMAWorld.testPutProcNull) ... ok testPutProcNull (test_rma_nb.TestRMAWorld.testPutProcNull) ... ok testPutProcNull (test_rma_nb.TestRMAWorld.testPutProcNull) ... ok testPutProcNull (test_rma_nb.TestRMAWorld.testPutProcNull) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testBuffering (test_session.TestSession.testBuffering) ... ok
testPickle (test_session.TestSession.testPickle) ... ok
testSessionGetInfo (test_session.TestSession.testSessionGetInfo) ... ok
testSessionInit (test_session.TestSession.testSessionInit) ... ok
testSessionPsetGroup (test_session.TestSession.testSessionPsetGroup) ... ok
testSessionPsetInfo (test_session.TestSession.testSessionPsetInfo) ... ok
testSessionPsets (test_session.TestSession.testSessionPsets) ... ok
testSessionSELF (test_session.TestSession.testSessionSELF) ... ok
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
testSessionSELF (test_session.TestSession.testSessionSELF) ...
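The `mcmd=spawn ... endcmd` blocks above are raw PMI-1 spawn requests that the Hydra proxy forwards upstream to mpiexec: two command blocks per request (`nprocs=1` and `nprocs=2`), matching a `Spawn_multiple` of three `spawn_child.py` processes. As a rough illustration only (this helper is hypothetical, not part of mpi4py, Hydra, or the build), the whitespace-separated key=value wire format seen in the log can be split into per-command dictionaries with a few lines of Python:

```python
# Hypothetical helper: parse a PMI "mcmd=spawn ... endcmd" request, as seen
# in the Hydra proxy log lines above, into one dict per command block.
def parse_pmi_spawn(text):
    """Split a PMI spawn request into a list of dicts, one per mcmd block."""
    blocks, current = [], None
    for token in text.split():
        if token == "endcmd":          # terminator: close the current block
            if current is not None:
                blocks.append(current)
                current = None
            continue
        key, _, value = token.partition("=")
        if key == "mcmd":              # "mcmd=spawn" starts a new block
            current = {"mcmd": value}
        elif current is not None:
            current[key] = value
    return blocks

# Abridged example taken from the request logged above:
req = ("mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 "
       "spawnssofar=1 argcnt=2 "
       "arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py "
       "info_num=0 endcmd "
       "mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 "
       "spawnssofar=2 argcnt=2 info_num=0 endcmd")
cmds = parse_pmi_spawn(req)
print(len(cmds), cmds[0]["nprocs"], cmds[1]["nprocs"])  # → 2 1 2
```

Whitespace splitting suffices here because the `$`-delimited `preput_val_0` fields in this log contain no spaces; a robust parser would need to honor PMI's quoting rules.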
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_1_1447012157_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 
spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
testSessionWORLD (test_session.TestSession.testSessionWORLD) ... ok
testArgsBad (test_spawn.TestSpawnMultipleSelf.testArgsBad) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelf.testArgsOnlyAtRoot) ... ok
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_2_743006281_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_3_1511529367_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
testSessionWORLD (test_session.TestSession.testSessionWORLD) ... ok
testArgsBad (test_spawn.TestSpawnMultipleSelf.testArgsBad) ... ok
testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelf.testArgsOnlyAtRoot) ...
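Because four MPI ranks write to the same stream, every unittest result line of the form `testName (module.Class.testName) ... ok` appears up to four times, interleaved with proxy chatter. As a hypothetical post-processing sketch (not part of the build or of mpi4py), such lines can be extracted and tallied with a small regex:

```python
import re

# Hypothetical helper: extract and tally unittest result lines of the form
# "testName (module.Class.testName) ... ok" from interleaved build-log text.
RESULT = re.compile(r"(\w+) \(([\w.]+)\) \.\.\. (ok|FAIL|ERROR|skipped)")

def tally(log_text):
    """Return a dict mapping result status ("ok", "FAIL", ...) to its count."""
    counts = {}
    for _name, _path, status in RESULT.findall(log_text):
        counts[status] = counts.get(status, 0) + 1
    return counts

# Two result lines taken from the log output above:
sample = ("testSync (test_rma.TestRMAWorld.testSync) ... ok "
          "testArgsBad (test_spawn.TestSpawnMultipleSelf.testArgsBad) ... ok")
print(tally(sample))  # → {'ok': 2}
```

Dividing each count by the number of ranks (four here) would recover the per-suite totals that a single-process run would report.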
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_4_574555749_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=init pmi_version=1 
pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_1_1447012157_virt32a [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_1_1447012157_virt32a [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 
keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_1_1447012157_virt32a [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI 
[proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_2_743006281_virt32a [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_4_574555749_virt32a [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command 
from downstream 3-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_3_1511529367_virt32a [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=barrier_in [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get_maxes 
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=put kvsname=kvs_27847_1_1447012157_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 2-1: 
cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=put kvsname=kvs_27847_4_574555749_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 [proxy:0@virt32a] cached command: 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:3:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70475A4348504A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_2_743006281_virt32a [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_3_1511529367_virt32a [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_4_574555749_virt32a [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_2_743006281_virt32a [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command 
(proxy:3:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=put kvsname=kvs_27847_3_1511529367_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_3_1511529367_virt32a [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 2-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get 
kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_4_574555749_virt32a [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=put kvsname=kvs_27847_1_1447012157_virt32a key=P0-businesscard value=description#virt32a$port#54029$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#54029$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=put kvsname=kvs_27847_1_1447012157_virt32a key=P2-businesscard value=description#virt32a$port#55869$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#55869$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=barrier_in [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:3:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=put kvsname=kvs_27847_2_743006281_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 [proxy:0@virt32a] Sending PMI 
command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=put kvsname=kvs_27847_1_1447012157_virt32a key=P1-businesscard value=description#virt32a$port#42825$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#42825$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#54029$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#55869$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42825$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#54029$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#55869$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42825$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#54029$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#55869$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42825$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#54029$ifname#127.0.1.1$ 
P2-businesscard=description#virt32a$port#55869$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42825$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] 
Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:3:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 [mpiexec@virt32a] Sending internal PMI command (proxy:3:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 
found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70627346626C4900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704366714C647900 found=TRUE [mpiexec@virt32a] [pgid: 1] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:1:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7059616D63475500 found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54029$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=P1-businesscard 
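An aside on the `-bcast-1-0` values being put and fetched above: they look like hex-encoded, NUL-terminated C strings. Decoding one of the values from this log (an observation about the pattern, not a claim about MPICH internals) recovers a shared-memory segment path under `/dev/shm`:

```python
# Decode one of the hex `-bcast-1-0` payloads seen in the PMI exchange above.
# Assumption: the payload is an ASCII path with a trailing NUL terminator.
payload = ("2F6465762F73686D2F6D706963685F"
           "736861725F746D70627346626C4900")
path = bytes.fromhex(payload).rstrip(b"\x00").decode("ascii")
print(path)  # /dev/shm/mpich_shar_tmpbsFblI
```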
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42825$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=get kvsname=kvs_27847_1_1447012157_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#55869$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=put kvsname=kvs_27847_4_574555749_virt32a key=P1-businesscard value=description#virt32a$port#38907$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#38907$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=put kvsname=kvs_27847_4_574555749_virt32a key=P0-businesscard value=description#virt32a$port#45103$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#45103$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI 
command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=put kvsname=kvs_27847_4_574555749_virt32a key=P2-businesscard value=description#virt32a$port#41153$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#41153$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#38907$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45103$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41153$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#38907$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45103$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41153$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#38907$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45103$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41153$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#38907$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45103$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41153$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=barrier_out [proxy:0@virt32a] got pmi 
command from downstream 3-1: cmd=put kvsname=kvs_27847_3_1511529367_virt32a key=P1-businesscard value=description#virt32a$port#51941$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#51941$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=put kvsname=kvs_27847_3_1511529367_virt32a key=P2-businesscard value=description#virt32a$port#52663$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#52663$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=put kvsname=kvs_27847_2_743006281_virt32a key=P0-businesscard value=description#virt32a$port#45669$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#45669$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=put kvsname=kvs_27847_2_743006281_virt32a key=P2-businesscard value=description#virt32a$port#41823$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#41823$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=put kvsname=kvs_27847_2_743006281_virt32a key=P1-businesscard value=description#virt32a$port#41181$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P1-businesscard=description#virt32a$port#41181$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=put kvsname=kvs_27847_3_1511529367_virt32a key=P0-businesscard value=description#virt32a$port#42453$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#42453$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#45669$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41823$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41181$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#45669$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41823$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41181$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#51941$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52663$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42453$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#51941$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52663$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42453$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI 
command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#45669$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41823$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41181$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#45669$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41823$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41181$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#51941$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52663$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42453$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:3:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#51941$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52663$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42453$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:3:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 4] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:4:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending 
PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45103$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38907$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=get kvsname=kvs_27847_4_574555749_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41153$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: 
cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 2] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:2:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 3] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:3:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI 
command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-1: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-2: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45669$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get 
kvsname=kvs_27847_2_743006281_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41181$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42453$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=get kvsname=kvs_27847_2_743006281_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41823$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51941$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 3-0: cmd=get kvsname=kvs_27847_3_1511529367_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52663$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] 
mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_5_996212552_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 
'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get 
kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE ok testCommSpawn (test_spawn.TestSpawnMultipleSelf.testCommSpawn) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 1-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 1-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 1-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 
arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_6_605515905_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 [100 inherited environment variables, identical to the earlier proxy launch above, elided] --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnMultipleSelf.testCommSpawn) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_5_996212552_virt32a [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = 
CMD_PMI [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_5_996212552_virt32a [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI ok testCommSpawn (test_spawn.TestSpawnMultipleSelf.testCommSpawn) ... 
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_7_663597649_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 [100 inherited environment variables, identical to the earlier proxy launch above, elided] --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [mpiexec@virt32a] [pgid: 5] got PMI command: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:5:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 5] got PMI command: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:5:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn 
nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 4-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 4-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 4-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 
0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_8_126322094_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 [100 inherited environment variables, identical to the earlier proxy launch above, elided] --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnMultipleSelf.testCommSpawn) ... 
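[Note: the PMI traffic in this log is plain text: each line is a series of space-separated key=value tokens (e.g. `cmd=get_result rc=0 value=... found=TRUE`), and business-card values pack fields as `description#virt32a$port#35709$ifname#127.0.1.1$`, with `$` separating entries and `#` separating each key from its value. A minimal sketch of decoding these lines follows; the helper names are hypothetical and only the wire format is taken from the log above.]

```python
# Sketch: decode the Hydra PMI wire lines seen in this log.
# Helper names are hypothetical; the formats come from the log itself.

def parse_pmi_command(line: str) -> dict:
    """Split a PMI command line such as
    'cmd=get_result rc=0 value=... found=TRUE'
    into a {key: value} dict of its space-separated key=value tokens."""
    fields = {}
    for token in line.split():
        key, sep, value = token.partition("=")
        if sep:  # ignore any token without an '='
            fields[key] = value
    return fields

def parse_business_card(card: str) -> dict:
    """Decode a business-card value like
    'description#virt32a$port#35709$ifname#127.0.1.1$':
    '$' separates entries, '#' separates each key from its value."""
    return dict(
        entry.split("#", 1)
        for entry in card.rstrip("$").split("$")
        if "#" in entry
    )

fields = parse_pmi_command(
    "cmd=get_result rc=0 "
    "value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE"
)
print(parse_business_card(fields["value"])["port"])  # → 35709
```

[End of note; the raw log resumes below.]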
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 2-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 2-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 2-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=put kvsname=kvs_27847_5_996212552_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7062695941777700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7062695941777700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7062695941777700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7062695941777700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 5] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7062695941777700 [mpiexec@virt32a] [pgid: 5] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:5:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7062695941777700 [mpiexec@virt32a] Sending internal PMI command (proxy:5:0): cmd=barrier_out 
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7062695941777700 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 3-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 3-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 3-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 5-0: cmd=put kvsname=kvs_27847_5_996212552_virt32a key=P0-businesscard value=description#virt32a$port#58981$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#58981$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 5-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 5-1: cmd=put kvsname=kvs_27847_5_996212552_virt32a key=P1-businesscard value=description#virt32a$port#36219$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#36219$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 5-1: cmd=barrier_in
[proxy:0@virt32a] flushing 2 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#58981$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36219$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#58981$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36219$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 5] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#58981$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36219$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 5] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:5:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#58981$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36219$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:5:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_8_126322094_virt32a
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_6_605515905_virt32a
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_6_605515905_virt32a
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 8] got PMI command: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:8:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 6] got PMI command: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:6:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=put kvsname=kvs_27847_6_605515905_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D707375736A597A00
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707375736A597A00
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=barrier_in
[mpiexec@virt32a] [pgid: 6] got PMI command: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:6:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 5-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_7_663597649_virt32a
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_8_126322094_virt32a
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_7_663597649_virt32a
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707375736A597A00
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707375736A597A00
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 5-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 8] got PMI command: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:8:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 6] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707375736A597A00
[mpiexec@virt32a] [pgid: 6] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:6:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707375736A597A00
[mpiexec@virt32a] Sending internal PMI command (proxy:6:0): cmd=barrier_out
[mpiexec@virt32a] [pgid: 7] got PMI command: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:7:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 5] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:5:0): cmd=barrier_out
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D707375736A597A00 found=TRUE
[mpiexec@virt32a] [pgid: 7] got PMI command: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:7:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 5-1: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=put kvsname=kvs_27847_7_663597649_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7074524A46627200
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074524A46627200
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=put kvsname=kvs_27847_8_126322094_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D703041756E514400
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703041756E514400
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074524A46627200
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074524A46627200
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 7] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074524A46627200
[mpiexec@virt32a] [pgid: 7] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:7:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074524A46627200
[mpiexec@virt32a] Sending internal PMI command (proxy:7:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7074524A46627200 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=put kvsname=kvs_27847_7_663597649_virt32a key=P0-businesscard value=description#virt32a$port#35393$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#35393$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703041756E514400
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703041756E514400
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 8] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703041756E514400
[mpiexec@virt32a] [pgid: 8] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:8:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703041756E514400
[mpiexec@virt32a] Sending internal PMI command (proxy:8:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=put kvsname=kvs_27847_7_663597649_virt32a key=P1-businesscard value=description#virt32a$port#34647$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#34647$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=barrier_in
[proxy:0@virt32a] flushing 2 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#35393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34647$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#35393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34647$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 7] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#35393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34647$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 7] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:7:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#35393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34647$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:7:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D703041756E514400 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 7] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:7:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 7-1: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#58981$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 5-0: cmd=get kvsname=kvs_27847_5_996212552_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36219$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=put kvsname=kvs_27847_6_605515905_virt32a key=P0-businesscard value=description#virt32a$port#56021$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#56021$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=put kvsname=kvs_27847_6_605515905_virt32a key=P1-businesscard value=description#virt32a$port#37265$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#37265$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=barrier_in
[proxy:0@virt32a] flushing 2 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#56021$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37265$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#56021$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37265$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[mpiexec@virt32a] [pgid: 6] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#56021$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37265$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 6] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:6:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#56021$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37265$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:6:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35393$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 7-0: cmd=get kvsname=kvs_27847_7_663597649_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34647$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=put kvsname=kvs_27847_8_126322094_virt32a key=P0-businesscard value=description#virt32a$port#36515$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#36515$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=put kvsname=kvs_27847_8_126322094_virt32a key=P1-businesscard value=description#virt32a$port#36153$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#36153$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=barrier_in
[proxy:0@virt32a] flushing 2 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#36515$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36153$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#36515$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36153$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 6] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:6:0): cmd=barrier_out
[mpiexec@virt32a] [pgid: 8] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#36515$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36153$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 8] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:8:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#36515$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36153$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:8:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 6-1: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 8] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:8:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56021$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-1: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 6-0: cmd=get kvsname=kvs_27847_6_605515905_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37265$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36515$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 8-0: cmd=get kvsname=kvs_27847_8_126322094_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36153$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0:
--version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_9_1718046467_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=.
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults1) ... 
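The PMI traffic above repeatedly exchanges "business card" values such as `description#virt32a$port#45101$ifname#127.0.1.1$`: `key#value` fields joined by `$`. A minimal sketch of how such a string could be decoded when reading these logs; the helper name is hypothetical and the field names are taken directly from the log lines above.

```python
# Illustrative log-reading helper (not part of the build): decode the
# Hydra PMI "business card" strings seen above, which are '$'-separated
# lists of 'key#value' fields with a trailing '$'.

def parse_business_card(card: str) -> dict:
    """Split a '$'-separated list of 'key#value' fields into a dict."""
    fields = {}
    for part in card.split("$"):
        if not part:
            continue  # the trailing '$' leaves an empty chunk; skip it
        key, _, value = part.partition("#")
        fields[key] = value
    return fields

# Example value copied verbatim from the log:
card = "tag#0$description#virt32a$port#45101$ifname#127.0.1.1$"
print(parse_business_card(card))
# {'tag': '0', 'description': 'virt32a', 'port': '45101', 'ifname': '127.0.1.1'}
```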
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_10_1399388204_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 
'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults1) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_11_1552270352_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 
'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults1) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 7-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 7-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 8-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 8-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 5-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_9_1718046467_virt32a [proxy:0@virt32a] got pmi command from downstream 5-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 9-0: 
cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 9] got PMI command: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:9:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 
execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_12_1741254948_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults1) ... 
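The PMI traffic that follows is Hydra's line-oriented PMI-1 wire protocol: each record is a run of space-separated `key=value` tokens, always led by a `cmd=` token (e.g. `cmd=get kvsname=... key=PMI_process_mapping`). As a minimal sketch (the `parse_pmi` helper is hypothetical, not part of MPICH), such a record can be decoded like this:

```python
def parse_pmi(line: str) -> dict:
    """Split a PMI-1 wire record ('cmd=get kvsname=k key=x ...') into a dict.

    Assumes the simple token format visible in the log below: tokens are
    space-separated and each token contains exactly one 'key=value' pair.
    """
    return dict(token.split("=", 1) for token in line.split())

# A record taken verbatim from the log below:
msg = parse_pmi("cmd=get kvsname=kvs_27847_10_1399388204_virt32a "
                "key=PMI_process_mapping")
# msg["cmd"] is "get", msg["key"] is "PMI_process_mapping"
```

This token format is why the proxy/mpiexec lines below all read as `cmd=... rc=... key=... value=...` runs.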
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=put kvsname=kvs_27847_9_1718046467_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706A5446317A4600 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706A5446317A4600 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_10_1399388204_virt32a [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 10] got PMI command: cmd=get kvsname=kvs_27847_10_1399388204_virt32a 
key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:10:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=put kvsname=kvs_27847_10_1399388204_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_10_1399388204_virt32a [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_9_1718046467_virt32a [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 6-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 6-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 10] got PMI command: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:10:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 9] got PMI command: cmd=get 
kvsname=kvs_27847_9_1718046467_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:9:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_11_1552270352_virt32a [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 10] got PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000 [mpiexec@virt32a] [pgid: 10] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:10:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000 [mpiexec@virt32a] Sending internal PMI command (proxy:10:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706A5446317A4600 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706A5446317A4600 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000 found=TRUE [mpiexec@virt32a] [pgid: 9] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706A5446317A4600 [mpiexec@virt32a] [pgid: 9] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:9:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706A5446317A4600 [mpiexec@virt32a] Sending internal PMI command (proxy:9:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command 
from downstream 11-1: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 11] got PMI command: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:11:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706A5446317A4600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=put kvsname=kvs_27847_10_1399388204_virt32a key=P0-businesscard value=description#virt32a$port#44817$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#44817$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_12_1741254948_virt32a [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending 
PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=put kvsname=kvs_27847_10_1399388204_virt32a key=P1-businesscard value=description#virt32a$port#43445$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#43445$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 12] got PMI command: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:12:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#44817$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43445$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#44817$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43445$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI 
command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 10] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#44817$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43445$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 10] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:10:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#44817$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43445$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:10:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=put kvsname=kvs_27847_12_1741254948_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=put kvsname=kvs_27847_9_1718046467_virt32a key=P0-businesscard value=description#virt32a$port#48789$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#48789$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=put kvsname=kvs_27847_9_1718046467_virt32a key=P1-businesscard value=description#virt32a$port#44137$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#44137$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out 
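The long hex strings stored under the `-bcast-1-0` keys above are hex-encoded, NUL-terminated paths of shared-memory segments under `/dev/shm`. As a sketch, one of the values from the log decodes like this:

```python
# Hex value copied verbatim from a 'cmd=put ... key=-bcast-1-0' record above.
raw = bytes.fromhex(
    "2F6465762F73686D2F6D706963685F736861725F746D7032566A374E5000"
)
# Strip the trailing NUL terminator to recover the shared-memory path.
path = raw.decode("ascii").rstrip("\x00")
# path == "/dev/shm/mpich_shar_tmp2Vj7NP"
```

Each spawned process group broadcasts its own segment name this way, which is why the same key carries a different hex value per `kvsname`.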
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#48789$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44137$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#48789$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44137$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=barrier_in [mpiexec@virt32a] [pgid: 9] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#48789$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44137$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 9] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:9:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#48789$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44137$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:9:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 10] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:10:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI 
command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 10-1: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_11_1552270352_virt32a [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 9] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:9:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get 
kvsname=kvs_27847_11_1552270352_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 9-1: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 11] got PMI command: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:11:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#44817$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=put kvsname=kvs_27847_11_1552270352_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704A3363376A6A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3363376A6A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 10-0: cmd=get kvsname=kvs_27847_10_1399388204_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#43445$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put 
command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3363376A6A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3363376A6A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 11] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3363376A6A00 [mpiexec@virt32a] [pgid: 11] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:11:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3363376A6A00 [mpiexec@virt32a] Sending internal PMI command (proxy:11:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704A3363376A6A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ 
found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_12_1741254948_virt32a [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=put kvsname=kvs_27847_11_1552270352_virt32a key=P0-businesscard value=description#virt32a$port#47125$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47125$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=put kvsname=kvs_27847_11_1552270352_virt32a key=P1-businesscard value=description#virt32a$port#55379$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P1-businesscard=description#virt32a$port#55379$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#47125$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55379$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#47125$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55379$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 11] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#47125$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55379$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 11] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:11:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#47125$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55379$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:11:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 12] got PMI command: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:12:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get 
kvsname=kvs_27847_9_1718046467_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48789$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 9-0: cmd=get kvsname=kvs_27847_9_1718046467_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#44137$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 12] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00 [mpiexec@virt32a] [pgid: 12] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:12:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00 [mpiexec@virt32a] Sending internal PMI command (proxy:12:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=put kvsname=kvs_27847_12_1741254948_virt32a key=P0-businesscard value=description#virt32a$port#38793$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#38793$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=put kvsname=kvs_27847_12_1741254948_virt32a key=P1-businesscard value=description#virt32a$port#36867$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#36867$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out 
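The `P0-businesscard` / `P1-businesscard` values exchanged above are Hydra "business cards": `field#value` pairs joined by `$`, carrying the address a rank listens on. A minimal sketch of unpacking one (the `parse_businesscard` helper is hypothetical, the separator format is taken from the log itself):

```python
def parse_businesscard(card: str) -> dict:
    """Unpack a Hydra business card: 'field#value' pairs separated by '$'."""
    return dict(f.split("#", 1) for f in card.rstrip("$").split("$"))

# A card copied verbatim from a 'cmd=put ... key=P0-businesscard' record above:
bc = parse_businesscard("description#virt32a$port#38793$ifname#127.0.1.1$")
# bc == {"description": "virt32a", "port": "38793", "ifname": "127.0.1.1"}
```

Spawned children read the parent's card back via the `PARENT_ROOT_PORT_NAME` key (same format, with a leading `tag#0$` field) to connect their intercommunicator.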
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#38793$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36867$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#38793$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36867$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 12] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#38793$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36867$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 12] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:12:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#38793$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36867$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:12:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 11] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:11:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=get 
kvsname=kvs_27847_11_1552270352_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 12] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:12:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47125$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=get kvsname=kvs_27847_11_1552270352_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#55379$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got 
pmi command from downstream 12-0: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38793$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=get kvsname=kvs_27847_12_1741254948_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36867$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_13_355277597_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 
'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults2) ...
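The `cmd=mput -bcast-1-0=2F6465762F73686D2F...` records earlier in this log carry a hex-encoded, NUL-terminated payload (here, the name of an MPICH shared-memory segment). A minimal sketch of decoding such a value; the helper name is ours, not part of MPICH:

```python
# Decode a hydra "-bcast-*" value: pairs of hex digits form bytes, and
# the payload is a NUL-terminated ASCII path.
def decode_bcast_value(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    return raw.rstrip(b"\x00").decode("ascii")

path = decode_bcast_value(
    "2F6465762F73686D2F6D706963685F736861725F746D7077516A6D346C00"
)
print(path)  # -> /dev/shm/mpich_shar_tmpwQjm4l
```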
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_14_1225444705_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113'
'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults2) ...
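The businesscard values exchanged through `cmd=put`/`cmd=get` above (e.g. `description#virt32a$port#38793$ifname#127.0.1.1$`) are `$`-terminated fields, each a `key#value` pair. An illustrative parser for reading these log values; this is our sketch, not MPICH's own implementation:

```python
# Split a PMI businesscard string into its key/value fields.
# Fields are terminated by "$"; within a field, "#" separates key
# from value (values such as IP addresses may contain further dots).
def parse_businesscard(card: str) -> dict:
    fields = {}
    for field in card.split("$"):
        if not field:
            continue  # skip the empty piece after the trailing "$"
        key, _, value = field.partition("#")
        fields[key] = value
    return fields

card = parse_businesscard("description#virt32a$port#38793$ifname#127.0.1.1$")
print(card)  # -> {'description': 'virt32a', 'port': '38793', 'ifname': '127.0.1.1'}
```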
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 10-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 10-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 9-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 9-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py
arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_15_1042170918_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults2) ...
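The `mcmd=spawn ... endcmd` records above are whitespace-separated `key=value` tokens (none of the values in this log contain spaces). A hypothetical helper for picking one such record apart when reading the dump; it is not hydra's parser:

```python
# Split one "mcmd=spawn ... endcmd" wire record into key/value pairs.
# Assumes values contain no whitespace, as in this log; stops at the
# "endcmd" terminator.
def parse_spawn_cmd(record: str) -> dict:
    pairs = {}
    for token in record.split():
        if token == "endcmd":
            break
        key, sep, value = token.partition("=")
        if sep:
            pairs[key] = value
    return pairs

rec = ("mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 "
       "spawnssofar=1 argcnt=2 info_num=0 endcmd")
info = parse_spawn_cmd(rec)
print(info["nprocs"], info["execname"])  # -> 1 /usr/bin/python3.12
```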
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_16_1508108784_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113'
'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelf.testCommSpawnDefaults2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 12-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_14_1225444705_virt32a [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from 
downstream 14-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_14_1225444705_virt32a [proxy:0@virt32a] got pmi command from downstream 12-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 14] got PMI command: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:14:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending 
upstream internal PMI command: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 14] got PMI command: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:14:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_13_355277597_virt32a [proxy:0@virt32a] got pmi command from downstream 11-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 11-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_13_355277597_virt32a [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) 
found=TRUE [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=put kvsname=kvs_27847_14_1225444705_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7041485A46337500 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041485A46337500 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 13] got PMI command: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:13:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 13] got PMI command: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:13:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't 
understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041485A46337500 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041485A46337500 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 14] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041485A46337500 [mpiexec@virt32a] [pgid: 14] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:14:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041485A46337500 [mpiexec@virt32a] Sending internal PMI command (proxy:14:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=put kvsname=kvs_27847_13_355277597_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704773434D327A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704773434D327A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command 
upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704773434D327A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704773434D327A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7041485A46337500 found=TRUE [mpiexec@virt32a] [pgid: 13] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704773434D327A00 [mpiexec@virt32a] [pgid: 13] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:13:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704773434D327A00 [mpiexec@virt32a] Sending internal PMI command (proxy:13:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=put kvsname=kvs_27847_14_1225444705_virt32a key=P0-businesscard value=description#virt32a$port#52337$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#52337$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=put kvsname=kvs_27847_14_1225444705_virt32a key=P1-businesscard value=description#virt32a$port#47625$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#47625$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 15-0: 
cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#52337$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47625$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#52337$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47625$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 14] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#52337$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47625$ifname#127.0.1.1$ [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704773434D327A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 14] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:14:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#52337$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47625$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:14:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_15_1042170918_virt32a [proxy:0@virt32a] Sending PMI command: 
cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 15] got PMI command: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:15:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=put kvsname=kvs_27847_15_1042170918_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704F667133697200 [proxy:0@virt32a] cached command: 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704F667133697200 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_15_1042170918_virt32a [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=put kvsname=kvs_27847_13_355277597_virt32a key=P0-businesscard value=description#virt32a$port#58327$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#58327$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=put kvsname=kvs_27847_13_355277597_virt32a key=P1-businesscard value=description#virt32a$port#58511$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#58511$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 14] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:14:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 
13-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#58327$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#58511$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#58327$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#58511$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 13] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#58327$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#58511$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 13] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:13:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#58327$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#58511$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:13:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 15] got PMI command: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:15:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704F667133697200 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704F667133697200 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 15] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704F667133697200 [mpiexec@virt32a] [pgid: 15] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:15:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704F667133697200 [mpiexec@virt32a] Sending internal PMI command (proxy:15:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704F667133697200 found=TRUE [proxy:0@virt32a] got pmi command from downstream 
13-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52337$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=get kvsname=kvs_27847_14_1225444705_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47625$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 13] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:13:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=put kvsname=kvs_27847_15_1042170918_virt32a key=P0-businesscard value=description#virt32a$port#58191$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#58191$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=put kvsname=kvs_27847_15_1042170918_virt32a key=P1-businesscard value=description#virt32a$port#33845$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#33845$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#58191$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33845$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#58191$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33845$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 15] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#58191$ifname#127.0.1.1$ 
P1-businesscard=description#virt32a$port#33845$ifname#127.0.1.1$ [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 15] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:15:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#58191$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33845$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:15:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_16_1508108784_virt32a [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get 
kvsname=kvs_27847_16_1508108784_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 16] got PMI command: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:16:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_16_1508108784_virt32a [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 16] got PMI command: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:16:0): 
cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 15] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:15:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#58327$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=put kvsname=kvs_27847_16_1508108784_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=get kvsname=kvs_27847_13_355277597_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#58511$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 16] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00 [mpiexec@virt32a] [pgid: 16] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:16:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00 [mpiexec@virt32a] Sending internal PMI command (proxy:16:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=PARENT_ROOT_PORT_NAME 
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#58191$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=get kvsname=kvs_27847_15_1042170918_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33845$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=put kvsname=kvs_27847_16_1508108784_virt32a key=P1-businesscard value=description#virt32a$port#37023$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#37023$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a 
key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=put kvsname=kvs_27847_16_1508108784_virt32a key=P0-businesscard value=description#virt32a$port#60689$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#60689$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#37023$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#60689$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#37023$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#60689$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 16] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#37023$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#60689$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 16] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:16:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#37023$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#60689$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:16:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out 
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 16] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:16:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60689$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=get kvsname=kvs_27847_16_1508108784_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37023$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname 
kvs_27847_17_889009645_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelf.testErrcodes) ... 
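An aside on the `-bcast-1-0` value cached and mput earlier in this trace: it is a hex-encoded, NUL-terminated C string naming an MPICH shared-memory segment. A one-off decode (illustrative, not part of the build) makes it readable:

```python
def decode_pmi_bcast(value: str) -> str:
    """Decode a hex-encoded PMI bcast payload from the Hydra trace.

    The payload is a NUL-terminated C string written out as hex digits,
    e.g. the -bcast-1-0 value put into kvs_27847_16_... above.
    """
    raw = bytes.fromhex(value)
    return raw.rstrip(b"\x00").decode("ascii")


if __name__ == "__main__":
    v = "2F6465762F73686D2F6D706963685F736861725F746D7041754E6A456B00"
    print(decode_pmi_bcast(v))  # /dev/shm/mpich_shar_tmpAuNjEk
```

The decoded path is the `/dev/shm` segment the local ranks use for shared-memory communication within the node.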
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 14-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 14-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py 
arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_18_949326903_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelf.testErrcodes) ... 
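The businesscard and PARENT_ROOT_PORT_NAME values exchanged throughout this trace use Hydra's `key#value$` field encoding (`description#virt32a$port#58191$ifname#127.0.1.1$`). A minimal parser for that encoding, written only to make the trace easier to read — the field names come straight from the log, not from any MPICH API:

```python
def parse_businesscard(card: str) -> dict:
    """Split a Hydra businesscard string into its key/value fields.

    Fields are '$'-terminated and each field is 'key#value', e.g.
    'description#virt32a$port#58191$ifname#127.0.1.1$'.
    """
    fields = {}
    for part in card.split("$"):
        if not part:
            continue  # trailing '$' leaves an empty piece
        key, _, value = part.partition("#")
        fields[key] = value
    return fields
```

Applied to the values above, this recovers the host (`description`), the TCP listen port each rank advertises, and the interface address (`ifname`, here the loopback alias 127.0.1.1 since all ranks run on virt32a).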
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_19_275304827_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 
'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelf.testErrcodes) ... 
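(Editor's aside, not part of the build output: the `-bcast-1-0` values put/get in the PMI traffic below are hex-encoded, NUL-terminated byte strings — in this log, the path of an MPICH shared-memory file. A minimal Python snippet, using only the standard library and a value copied verbatim from this log, shows how to decode one:)

```python
# Decode one of the hex-encoded -bcast-1-0 KVS values seen in this log.
# The value is a NUL-terminated byte string carrying a shared-memory path.
value = "2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00"
path = bytes.fromhex(value).rstrip(b"\x00").decode("ascii")
print(path)  # /dev/shm/mpich_shar_tmppH905l
```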
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 13-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 13-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 15-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 15-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_17_889009645_virt32a [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: 
cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_17_889009645_virt32a [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_17_889009645_virt32a [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=get kvsname=kvs_27847_17_889009645_virt32a 
key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=put kvsname=kvs_27847_17_889009645_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = 
CMD_PMI [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 
argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_20_878554174_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelf.testErrcodes) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_18_949326903_virt32a [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get 
kvsname=kvs_27847_18_949326903_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7070483930356C00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 18] got PMI command: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 
pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_18_949326903_virt32a [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_18_949326903_virt32a [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=barrier_in [mpiexec@virt32a] [pgid: 18] got PMI command: 
cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_19_275304827_virt32a [proxy:0@virt32a] got pmi command from downstream 16-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 16-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 
kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=barrier_in [mpiexec@virt32a] [pgid: 18] got PMI command: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:19:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_19_275304827_virt32a [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=put kvsname=kvs_27847_18_949326903_virt32a key=-bcast-1-0 
value=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 18] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 18] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending 
internal PMI command (proxy:19:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70584E7334443000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_19_275304827_virt32a [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get 
kvsname=kvs_27847_19_275304827_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=put kvsname=kvs_27847_17_889009645_virt32a key=P2-businesscard value=description#virt32a$port#45009$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#45009$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_20_878554174_virt32a [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=put kvsname=kvs_27847_17_889009645_virt32a key=P0-businesscard value=description#virt32a$port#57911$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#57911$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=put kvsname=kvs_27847_17_889009645_virt32a key=P1-businesscard value=description#virt32a$port#49343$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#49343$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=barrier_in 
[proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#45009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57911$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49343$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#45009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57911$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49343$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#45009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57911$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49343$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = 
CMD_PMI [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#45009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57911$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49343$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:19:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=put kvsname=kvs_27847_20_878554174_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=put kvsname=kvs_27847_19_275304827_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 19-0: 
cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:19:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 [mpiexec@virt32a] Sending internal PMI command (proxy:19:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=put kvsname=kvs_27847_18_949326903_virt32a key=P1-businesscard value=description#virt32a$port#54009$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#54009$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=put kvsname=kvs_27847_18_949326903_virt32a key=P0-businesscard value=description#virt32a$port#36539$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P0-businesscard=description#virt32a$port#36539$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F546533554100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=put kvsname=kvs_27847_18_949326903_virt32a key=P2-businesscard value=description#virt32a$port#39995$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#39995$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#54009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36539$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39995$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#54009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36539$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39995$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=barrier_in [mpiexec@virt32a] [pgid: 18] got PMI command: cmd=mput 
P1-businesscard=description#virt32a$port#54009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36539$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39995$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 18] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#54009$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36539$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39995$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=barrier_in [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_20_878554174_virt32a [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi 
command from downstream 20-1: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 17] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:17:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-1: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 17-2: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal 
PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 18] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:18:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-1: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-2: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=put kvsname=kvs_27847_19_275304827_virt32a key=P0-businesscard value=description#virt32a$port#46849$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#46849$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=put kvsname=kvs_27847_19_275304827_virt32a key=P1-businesscard value=description#virt32a$port#34143$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#34143$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=put kvsname=kvs_27847_19_275304827_virt32a key=P2-businesscard 
value=description#virt32a$port#45667$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#45667$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57911$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#46849$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34143$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45667$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#46849$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34143$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45667$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get 
kvsname=kvs_27847_17_889009645_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49343$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#46849$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34143$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45667$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:19:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#46849$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34143$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45667$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:19:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 17-0: cmd=get kvsname=kvs_27847_17_889009645_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45009$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_20_878554174_virt32a [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36539$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get 
kvsname=kvs_27847_20_878554174_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54009$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 18-0: cmd=get kvsname=kvs_27847_18_949326903_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#39995$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi 
command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 [proxy:0@virt32a] Sending 
upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 19] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:19:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705A785A38753700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=get kvsname=kvs_27847_19_275304827_virt32a 
key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=put kvsname=kvs_27847_20_878554174_virt32a key=P2-businesscard value=description#virt32a$port#60313$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#60313$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=put kvsname=kvs_27847_20_878554174_virt32a key=P0-businesscard value=description#virt32a$port#58687$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#58687$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=put kvsname=kvs_27847_20_878554174_virt32a key=P1-businesscard value=description#virt32a$port#36843$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#36843$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46849$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput 
P2-businesscard=description#virt32a$port#60313$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#58687$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36843$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#60313$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#58687$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36843$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34143$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=get kvsname=kvs_27847_19_275304827_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45667$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#60313$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#58687$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36843$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#60313$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#58687$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36843$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from 
downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 20] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:20:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] 
Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#58687$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36843$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=get kvsname=kvs_27847_20_878554174_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60313$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] 
Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-c6bfb5tm.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-c6bfb5tm.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-c6bfb5tm.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-c6bfb5tm.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_21_313947593_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-c6bfb5tm.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-c6bfb5tm.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testNoArgs (test_spawn.TestSpawnMultipleSelf.testNoArgs) ... 
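Note on the log format above: the business-card and PARENT_ROOT_PORT_NAME values exchanged through the PMI KVS (e.g. `description#virt32a$port#58687$ifname#127.0.1.1$`) use a simple `key#value$` field encoding. A minimal sketch of a decoder, using a value copied verbatim from this log (the helper name `parse_business_card` is ours, not part of MPICH):

```python
def parse_business_card(card: str) -> dict:
    """Split a Hydra business card such as
    'description#virt32a$port#58687$ifname#127.0.1.1$'
    into its key/value fields."""
    fields = {}
    for part in card.split("$"):
        if not part:  # the trailing '$' produces an empty trailing part
            continue
        key, _, value = part.partition("#")
        fields[key] = value
    return fields

# Values taken from the get_result / preput lines in this log.
card = parse_business_card("description#virt32a$port#58687$ifname#127.0.1.1$")
root = parse_business_card("tag#0$description#virt32a$port#35709$ifname#127.0.1.1$")
```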
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 18-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 18-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 18-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_21_313947593_virt32a
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] 
[pgid: 21] got PMI command: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-1n1o0hwh.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-1n1o0hwh.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-1n1o0hwh.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-1n1o0hwh.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_22_61898531_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 
'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-1n1o0hwh.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-1n1o0hwh.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 testNoArgs (test_spawn.TestSpawnMultipleSelf.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=put kvsname=kvs_27847_21_313947593_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get_maxes 
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_21_313947593_virt32a [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_21_313947593_virt32a [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 21] got PMI command: cmd=get 
kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 21] got PMI command: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 21] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 [mpiexec@virt32a] [pgid: 21] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=barrier_out [proxy:0@virt32a] Sending 
PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200 found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-go29n2y3.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-go29n2y3.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-go29n2y3.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-go29n2y3.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_23_187764304_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 
'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-go29n2y3.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-go29n2y3.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testNoArgs (test_spawn.TestSpawnMultipleSelf.testNoArgs) ... 
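Note on the `-bcast-1-0` KVS entries put and fetched in this log: the value is a hex-encoded, NUL-terminated C string. Decoding the value that appears above shows it names an MPICH shared-memory segment path. A quick illustration in plain Python (nothing MPICH-specific):

```python
# Hex-encoded value copied verbatim from the -bcast-1-0 put/get lines above.
value = "2F6465762F73686D2F6D706963685F736861725F746D7076355A48493200"

# Decode the hex string and strip the trailing NUL that terminates the C string.
decoded = bytes.fromhex(value).rstrip(b"\x00").decode("ascii")
print(decoded)  # → /dev/shm/mpich_shar_tmpv5ZHI2
```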
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 17-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 17-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 21-2: cmd=put kvsname=kvs_27847_21_313947593_virt32a key=P2-businesscard value=description#virt32a$port#36623$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#36623$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 21-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=put kvsname=kvs_27847_21_313947593_virt32a key=P0-businesscard value=description#virt32a$port#47013$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47013$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 17-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 21-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok
testNoArgs (test_spawn.TestSpawnMultipleSelf.testNoArgs) ... 
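Note on the PMI-1 wire commands throughout this log (`cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE`, `cmd=put kvsname=... key=... value=...`, etc.): each is a flat run of `key=value` tokens. A minimal sketch of a parser for the simple commands shown here; `parse_pmi` is our own helper, and it assumes values contain no embedded spaces, which holds for these lines:

```python
def parse_pmi(line: str) -> dict:
    """Parse a simple PMI-1 wire command like
    'cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE'
    into a key/value dict. Splits each token on the first '='."""
    return dict(tok.split("=", 1) for tok in line.split() if "=" in tok)

# A get_result line copied from this log.
reply = parse_pmi("cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE")
```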
[proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-zo01ryav.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-zo01ryav.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-zo01ryav.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-zo01ryav.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_24_1540351868_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-zo01ryav.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-zo01ryav.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=put kvsname=kvs_27847_21_313947593_virt32a key=P1-businesscard value=description#virt32a$port#59725$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#59725$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#36623$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47013$ifname#127.0.1.1$ 
P1-businesscard=description#virt32a$port#59725$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#36623$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47013$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#59725$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 21] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#36623$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47013$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#59725$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 21] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#36623$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47013$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#59725$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 19-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 19-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 19-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: 
cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 21] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:21:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_22_61898531_virt32a [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result 
rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47013$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59725$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=get kvsname=kvs_27847_21_313947593_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36623$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard 
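The trace above is Hydra's line-oriented PMI-1 wire protocol: each message is a `cmd=<name>` token followed by space-separated `key=value` pairs (the multi-part `mcmd=spawn ... endcmd` messages use a longer framing). A minimal sketch of decoding one of the simple commands — an illustrative helper, not MPICH's actual parser:

```python
# Rough sketch: split a simple PMI-1 command line (as logged above)
# into a {key: value} dict. Assumes no embedded spaces in values,
# which holds for the cmd=put/get/barrier messages in this trace.
def parse_pmi_command(line: str) -> dict:
    fields = {}
    for token in line.split():
        key, _, value = token.partition("=")
        fields[key] = value
    return fields

cmd = parse_pmi_command(
    "cmd=put kvsname=kvs_27847_21_313947593_virt32a "
    "key=P1-businesscard value=description#virt32a$port#59725$ifname#127.0.1.1$"
)
# cmd["cmd"] == "put", cmd["key"] == "P1-businesscard"
```

The proxy caches such `put`s locally and flushes them upstream as a single `mput` at the next `barrier_in`, which is the batching visible in the "flushing N put command(s) out" lines.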
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 20-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 20-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 20-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 22-2: 
cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_22_61898531_virt32a [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_22_61898531_virt32a [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=get kvsname=kvs_27847_22_61898531_virt32a 
key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=put kvsname=kvs_27847_22_61898531_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending 
upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_23_187764304_virt32a [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=get kvsname=kvs_27847_22_61898531_virt32a 
key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7063595963666800 found=TRUE [mpiexec@virt32a] [pgid: 23] got PMI command: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=put kvsname=kvs_27847_23_187764304_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_23_187764304_virt32a [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE 
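The `-bcast-1-0` values put into the KVS above are hex-encoded, NUL-terminated strings; decoding one (a sketch, assuming ASCII content) shows it is the `/dev/shm` shared-memory segment path that rank 0 broadcasts to the other local ranks:

```python
# Decode a hex-encoded, NUL-terminated KVS value from the trace.
# Illustrative only; the encoding is inferred from the logged values.
def decode_kvs_hex(value: str) -> str:
    return bytes.fromhex(value).decode("ascii").rstrip("\x00")

path = decode_kvs_hex(
    "2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400"
)
# path == "/dev/shm/mpich_shar_tmpvgpLmD"
```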
[proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 23] got PMI command: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_23_187764304_virt32a [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=put kvsname=kvs_27847_22_61898531_virt32a key=P0-businesscard value=description#virt32a$port#47915$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47915$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get 
kvsname=kvs_27847_23_187764304_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=put kvsname=kvs_27847_22_61898531_virt32a key=P1-businesscard value=description#virt32a$port#60159$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#60159$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 23] got PMI command: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_24_1540351868_virt32a [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get 
kvsname=kvs_27847_24_1540351868_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 24] got PMI command: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=put kvsname=kvs_27847_22_61898531_virt32a key=P2-businesscard value=description#virt32a$port#42275$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#42275$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#47915$ifname#127.0.1.1$ 
P1-businesscard=description#virt32a$port#60159$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#42275$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#47915$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#60159$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#42275$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_24_1540351868_virt32a [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#47915$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#60159$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#42275$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#47915$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#60159$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#42275$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=barrier_out 
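The `P<rank>-businesscard` strings exchanged above (and the `PARENT_ROOT_PORT_NAME` port names, which share the format) pack each rank's contact information into `$`-terminated `name#value` fields. A hedged decoder sketch, not MPICH's own code:

```python
# Sketch: unpack an MPICH business-card string from the trace into
# its fields. Format inferred from the logged values.
def parse_business_card(card: str) -> dict:
    fields = {}
    for part in card.split("$"):
        if part:
            name, _, value = part.partition("#")
            fields[name] = value
    return fields

bc = parse_business_card("description#virt32a$port#47915$ifname#127.0.1.1$")
# {'description': 'virt32a', 'port': '47915', 'ifname': '127.0.1.1'}
```

Once every rank's card is in the `keyval_cache`, peers look each other up by rank (`get key=P0-businesscard`, etc.) and connect over the advertised port/interface.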
[proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=barrier_in [mpiexec@virt32a] [pgid: 24] got PMI command: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 23] 
got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 [mpiexec@virt32a] [pgid: 23] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=put kvsname=kvs_27847_24_1540351868_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_24_1540351868_virt32a [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi 
command from downstream 23-2: cmd=put kvsname=kvs_27847_23_187764304_virt32a key=P2-businesscard value=description#virt32a$port#46333$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#46333$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=put kvsname=kvs_27847_23_187764304_virt32a key=P0-businesscard value=description#virt32a$port#42949$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#42949$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 22] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:22:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 24] got PMI command: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending PMI command: 
cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=put kvsname=kvs_27847_23_187764304_virt32a key=P1-businesscard value=description#virt32a$port#38933$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#38933$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#46333$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42949$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38933$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#46333$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42949$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38933$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = 
CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 23] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#46333$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42949$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38933$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 23] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#46333$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42949$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38933$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [mpiexec@virt32a] [pgid: 24] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 [mpiexec@virt32a] [pgid: 24] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): 
cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7043766441587000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47915$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60159$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=get kvsname=kvs_27847_22_61898531_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42275$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream 
hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 23] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:23:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsBad (test_spawn.TestSpawnMultipleSelfMany.testArgsBad) ... 
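An aside on the `-bcast-1-0` values exchanged above: they appear to be hex-encoded, NUL-terminated C strings naming MPICH shared-memory segments under `/dev/shm`. A minimal sketch of decoding one value taken verbatim from the trace (this is an observation about the logged data, not documented Hydra behavior):

```python
# Decode the hex-encoded KVS value seen in the PMI trace above.
# The trailing "00" is a C NUL terminator, stripped after decoding.
value = "2F6465762F73686D2F6D706963685F736861725F746D707667704C6D4400"
path = bytes.fromhex(value).rstrip(b"\x00").decode("ascii")
print(path)  # /dev/shm/mpich_shar_tmpvgpLmD
```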
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelfMany.testArgsOnlyAtRoot) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42949$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname 
kvs_27847_25_1938921375_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38933$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=get kvsname=kvs_27847_23_187764304_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46333$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=put kvsname=kvs_27847_24_1540351868_virt32a key=P0-businesscard 
value=description#virt32a$port#33975$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#33975$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=put kvsname=kvs_27847_24_1540351868_virt32a key=P1-businesscard value=description#virt32a$port#51477$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#51477$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=put kvsname=kvs_27847_24_1540351868_virt32a key=P2-businesscard value=description#virt32a$port#46555$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#46555$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#33975$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51477$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46555$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#33975$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51477$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46555$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream 
internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 24] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#33975$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51477$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46555$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 24] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#33975$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51477$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46555$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI 
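The `P*-businesscard` values being put and fetched above are Hydra "business cards": connection endpoints encoded as `key#value` pairs terminated by `$`. A small sketch of splitting one card from the trace into its fields (assumed from the logged format, not Hydra's own parser):

```python
# Split a Hydra business-card value from the trace into its fields.
# Format (as observed in this log): key#value pairs, each ending in '$'.
card = "description#virt32a$port#45101$ifname#127.0.1.1$"
info = dict(part.split("#", 1) for part in card.split("$") if part)
print(info)  # {'description': 'virt32a', 'port': '45101', 'ifname': '127.0.1.1'}
```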
[mpiexec@virt32a] [pgid: 24] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:24:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 21-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33975$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51477$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=get kvsname=kvs_27847_24_1540351868_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=description#virt32a$port#46555$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 21-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_25_1938921375_virt32a [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get kvsname=kvs_27847_25_1938921375_virt32a 
key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=put kvsname=kvs_27847_25_1938921375_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: 
cmd=my_kvsname rc=0 kvsname=kvs_27847_25_1938921375_virt32a [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_26_129316938_virt32a --pmi-spawner-kvsname 
kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 ok testArgsBad (test_spawn.TestSpawnMultipleSelfMany.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelfMany.testArgsOnlyAtRoot) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_25_1938921375_virt32a [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get kvsname=kvs_27847_25_1938921375_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700 found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname 
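The `-bcast-1-0` values exchanged above are hex-encoded, NUL-terminated byte strings; this one is the path of an MPICH shared-memory segment under /dev/shm. A minimal sketch decoding the value exactly as it appears in the log:

```python
# Decode the hex-encoded, NUL-terminated value cached under the
# -bcast-1-0 key (copied verbatim from the proxy log lines above).
raw = "2F6465762F73686D2F6D706963685F736861725F746D70785169596B3700"
decoded = bytes.fromhex(raw).rstrip(b"\x00").decode("ascii")
print(decoded)  # /dev/shm/mpich_shar_tmpxQiYk7
```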
kvs_27847_27_1029562426_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testArgsBad (test_spawn.TestSpawnMultipleSelfMany.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelfMany.testArgsOnlyAtRoot) ... 
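The two `--exec` blocks in the proxy arguments (appnum 0 with one process, appnum 1 with two, for a `--global-process-count` of 3) correspond to an `MPI.Comm.Spawn_multiple` call in mpi4py's spawn tests. A sketch of the spawn specification implied by the log; the actual `Spawn_multiple` call needs a live MPI session, so it is shown only in a comment:

```python
# Spawn specification matching the two --exec blocks above: appnum 0
# runs 1 process and appnum 1 runs 2, all executing spawn_child.py
# with the pybuild directory as an extra argument.
script = "/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py"
build = "/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build"

commands = ["/usr/bin/python3.12", "/usr/bin/python3.12"]
args = [[script, build], [script, build]]
maxprocs = [1, 2]

# Under a running MPI environment this would be (not runnable here):
#   from mpi4py import MPI
#   child = MPI.COMM_SELF.Spawn_multiple(commands, args, maxprocs)

print(sum(maxprocs))  # total spawned processes: 3
```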
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 22-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 22-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_26_129316938_virt32a [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_26_129316938_virt32a [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get 
kvsname=kvs_27847_26_129316938_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 22-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 26] got PMI command: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:26:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 26] got PMI command: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:26:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got 
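The `PMI_process_mapping` value `(vector,(0,1,3))` served repeatedly above is Hydra's compact process map: each triple is, as far as the format goes, (starting node id, number of nodes, processes per node) — here node 0, one node, three processes. A hypothetical parser, for illustration only:

```python
import re

def parse_process_mapping(value):
    """Parse a Hydra PMI_process_mapping string such as
    "(vector,(0,1,3))" into (node_id, num_nodes, procs_per_node)
    triples. Only the vector form seen in this log is handled."""
    if not value.startswith("(vector,"):
        raise ValueError("unsupported mapping: %r" % value)
    return [tuple(map(int, m.groups()))
            for m in re.finditer(r"\((\d+),(\d+),(\d+)\)", value)]

print(parse_process_mapping("(vector,(0,1,3))"))  # [(0, 1, 3)]
```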
pmi command from downstream 26-0: cmd=put kvsname=kvs_27847_26_129316938_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 26-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=put kvsname=kvs_27847_25_1938921375_virt32a key=P1-businesscard value=description#virt32a$port#52979$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#52979$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=put kvsname=kvs_27847_25_1938921375_virt32a key=P0-businesscard value=description#virt32a$port#57693$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#57693$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=put kvsname=kvs_27847_25_1938921375_virt32a key=P2-businesscard value=description#virt32a$port#39767$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#39767$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#52979$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57693$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39767$ifname#127.0.1.1$ 
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#52979$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57693$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39767$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#52979$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57693$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39767$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#52979$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57693$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39767$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 23-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 23-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 23-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=init pmi_version=1 pmi_subversion=1 
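The business cards cached by the `mput`/`keyval_cache` exchange above use MPICH's `key#value$` field encoding, e.g. `description#virt32a$port#52979$ifname#127.0.1.1$`. A small illustrative decoder, assuming only that fields are `#`-separated key/value pairs each terminated by `$`:

```python
def parse_businesscard(card):
    """Split an MPICH business card such as
    "description#virt32a$port#52979$ifname#127.0.1.1$" into a dict."""
    fields = {}
    for part in card.split("$"):
        if part:  # trailing "$" leaves an empty final part
            key, _, value = part.partition("#")
            fields[key] = value
    return fields

card = "description#virt32a$port#52979$ifname#127.0.1.1$"
print(parse_businesscard(card))
```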
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsBad (test_spawn.TestSpawnMultipleSelfMany.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleSelfMany.testArgsOnlyAtRoot) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name 
MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_28_1707642857_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi 
command from downstream 27-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_27_1029562426_virt32a [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_27_1029562426_virt32a [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_27_1029562426_virt32a [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get 
kvsname=kvs_27847_27_1029562426_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 27] got PMI command: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:27:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 27] got PMI command: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:27:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 27] got PMI command: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:27:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; 
forwarding downstream [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=put kvsname=kvs_27847_27_1029562426_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_26_129316938_virt32a [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 27] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 [mpiexec@virt32a] [pgid: 27] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:27:0): 
cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 [mpiexec@virt32a] Sending internal PMI command (proxy:27:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200 found=TRUE [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 26] got PMI command: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI 
command (proxy:26:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 25] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:25:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-1: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-2: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 26] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 [proxy:0@virt32a] Sending 
upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 26] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:26:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 [mpiexec@virt32a] Sending internal PMI command (proxy:26:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D703476565A307A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 24-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 24-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 24-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57693$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#52979$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 25-0: cmd=get kvsname=kvs_27847_25_1938921375_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#39767$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=put kvsname=kvs_27847_27_1029562426_virt32a key=P0-businesscard value=description#virt32a$port#60649$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#60649$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=put kvsname=kvs_27847_27_1029562426_virt32a key=P2-businesscard value=description#virt32a$port#39739$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#39739$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 27-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 28-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 
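The `-bcast-1-0` values exchanged above are hex-encoded, NUL-terminated file paths naming MPICH shared-memory segments under /dev/shm. An illustrative Python snippet (not part of the build) decodes the first one from the log:

```python
# Decode a PMI -bcast-1-0 value from the log above: it is a
# hex-encoded, NUL-terminated path to an MPICH shared-memory segment.
value = "2F6465762F73686D2F6D706963685F736861725F746D704E7250384E7200"
path = bytes.fromhex(value).rstrip(b"\x00").decode("ascii")
print(path)  # /dev/shm/mpich_shar_tmpNrP8Nr
```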
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_28_1707642857_virt32a
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 27-1: cmd=put kvsname=kvs_27847_27_1029562426_virt32a key=P1-businesscard value=description#virt32a$port#60097$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#60097$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 27-1: cmd=barrier_in
[proxy:0@virt32a] flushing 3 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#60649$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39739$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#60097$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#60649$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39739$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#60097$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=barrier_in
[mpiexec@virt32a] [pgid: 27] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#60649$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39739$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#60097$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 27] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:27:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#60649$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39739$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#60097$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:27:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 26-0: cmd=put kvsname=kvs_27847_26_129316938_virt32a key=P0-businesscard value=description#virt32a$port#51507$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#51507$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 26-2: cmd=put kvsname=kvs_27847_26_129316938_virt32a key=P2-businesscard value=description#virt32a$port#41109$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#41109$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 26-1: cmd=put kvsname=kvs_27847_26_129316938_virt32a key=P1-businesscard value=description#virt32a$port#33719$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#33719$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 26-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 26-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 26-2: cmd=barrier_in
[proxy:0@virt32a] flushing 3 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#51507$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41109$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33719$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#51507$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41109$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33719$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 26] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#51507$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41109$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33719$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 26] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:26:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#51507$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41109$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33719$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:26:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 27-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 27-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 27-1: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 27] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:27:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 26-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 27-1: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 27-2: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 26-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 26-1: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[mpiexec@virt32a] [pgid: 26] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:26:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 26-1: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 26-2: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_28_1707642857_virt32a
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_28_1707642857_virt32a
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60649$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60097$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=put kvsname=kvs_27847_28_1707642857_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 27-0: cmd=get kvsname=kvs_27847_27_1029562426_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#39739$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51507$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33719$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 26-0: cmd=get kvsname=kvs_27847_26_129316938_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41109$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7074794B4A374B00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=put kvsname=kvs_27847_28_1707642857_virt32a key=P2-businesscard value=description#virt32a$port#48931$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#48931$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=put kvsname=kvs_27847_28_1707642857_virt32a key=P0-businesscard value=description#virt32a$port#53621$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#53621$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=put kvsname=kvs_27847_28_1707642857_virt32a key=P1-businesscard value=description#virt32a$port#49143$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#49143$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=barrier_in
[proxy:0@virt32a] flushing 3 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#48931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53621$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49143$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#48931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53621$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49143$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#48931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53621$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49143$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#48931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53621$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49143$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 28] got PMI command: cmd=barrier_in
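The `mcmd=spawn ... endcmd` requests recorded in this log are streams of space-separated key=value tokens, one segment per application, terminated by `endcmd` (here two segments, matching the two `--exec` app blocks above). A small illustrative parser (an assumption about the wire text as printed, not MPICH code) splits such a stream into per-app dicts:

```python
def parse_spawn(mcmd: str):
    """Split a PMI 'mcmd=spawn ... endcmd' stream, as printed in this
    log, into one key/value dict per spawn segment. Assumes simple
    space-separated key=value tokens with no embedded spaces."""
    segments, current = [], {}
    for token in mcmd.split():
        if token == "endcmd":
            segments.append(current)
            current = {}
        elif "=" in token:
            key, _, val = token.partition("=")
            current[key] = val
    return segments

# Abbreviated version of the two-segment spawn request from the log.
spec = ("mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 "
        "spawnssofar=1 argcnt=2 info_num=0 endcmd "
        "mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 "
        "spawnssofar=2 argcnt=2 info_num=0 endcmd")
apps = parse_spawn(spec)
print(len(apps), apps[0]["nprocs"])  # 2 4
```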
[mpiexec@virt32a] Sending internal PMI command (proxy:28:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-1: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-2: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0
value=description#virt32a$port#53621$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 8 --auto-cleanup 1 --pmi-kvsname kvs_27847_29_635828565_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,8)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnMultipleSelfMany.testCommSpawn) ... 
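The `mcmd=spawn ... endcmd` blobs forwarded upstream are flat `key=value` token streams, one blob per application in an `MPI_Comm_spawn_multiple` call (here `totspawns=2`, with `spawnssofar` counting the blobs). A rough parser sketch under that assumption; the function name is hypothetical and the handling is simplified compared to what Hydra actually does (it assumes values contain no whitespace, which holds for the tokens in this log but not for PMI wire data in general):

```python
def parse_spawn_cmds(text):
    """Split a 'mcmd=spawn ... endcmd' stream into one dict per spawn blob.

    Simplified illustration of the wire format seen in the log; not the
    real Hydra parser.
    """
    spawns, current = [], None
    for token in text.split():
        if token == "mcmd=spawn":
            current = {}                       # start of a new blob
        elif token == "endcmd":
            spawns.append(current)             # blob complete
            current = None
        elif current is not None and "=" in token:
            key, _, value = token.partition("=")
            current[key] = value
    return spawns


# Abbreviated version of the two-blob spawn command from the log.
wire = ("mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 "
        "spawnssofar=1 argcnt=2 arg1=spawn_child.py arg2=build "
        "preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME endcmd "
        "mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 "
        "spawnssofar=2 argcnt=2 arg1=spawn_child.py arg2=build endcmd")
spawns = parse_spawn_cmds(wire)
print(len(spawns), spawns[0]["nprocs"], spawns[1]["spawnssofar"])
```

With `--global-process-count 8` in the proxy arguments above, the two 4-process blobs account for all eight spawned children.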
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49143$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 28-0: cmd=get kvsname=kvs_27847_28_1707642857_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48931$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 25-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 25-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 25-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[mpiexec@virt32a] [pgid: 29] got PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a
[mpiexec@virt32a] [pgid: 29] got PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 29] got PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=barrier_in
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4
execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 8 --auto-cleanup 1 --pmi-kvsname kvs_27847_30_1601083786_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,8)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnMultipleSelfMany.testCommSpawn) ... 
[mpiexec@virt32a] [pgid: 29] got PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 29-3: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=barrier_in
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2
arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 8 --auto-cleanup 1 --pmi-kvsname kvs_27847_31_1997143611_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,8)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnMultipleSelfMany.testCommSpawn) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 27-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 27-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 27-2: 
cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get kvsname=kvs_27847_29_635828565_virt32a 
key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_29_635828565_virt32a [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 29] got PMI 
command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 29] got PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 29] got PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 29] got PMI command: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 26-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 26-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=barrier_in 
[proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 29] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 [mpiexec@virt32a] [pgid: 29] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 26-0: 
cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F324A54636E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 
kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 
totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 8 --auto-cleanup 1 --pmi-kvsname kvs_27847_32_2082840723_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,8)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnMultipleSelfMany.testCommSpawn) ... 
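(Editor's note, not part of the captured log.) The `mcmd=spawn nprocs=4 ... totspawns=2` exchanges above are Hydra servicing an `MPI_Comm_spawn_multiple` call from `test_spawn.TestSpawnMultipleSelfMany.testCommSpawn`: two application specifications of 4 processes each, both running `test/spawn_child.py`, for the 8 total processes reported by `--global-process-count 8`. A minimal sketch of what that call shape looks like on the mpi4py side is below; the paths are abbreviated stand-ins for the build-tree paths in the log, and the actual `MPI.COMM_SELF.Spawn_multiple` invocation is shown only as a comment, since it requires mpi4py and a running MPI launcher.

```python
# Sketch of the spawn pattern behind the "totspawns=2 ... nprocs=4"
# PMI traffic in the log above. Two app specs, 4 processes each.
# Paths are hypothetical abbreviations of the build-tree paths.
commands = ["/usr/bin/python3.12", "/usr/bin/python3.12"]
args = [
    ["test/spawn_child.py", ".pybuild/cpython3_3.12/build"],
    ["test/spawn_child.py", ".pybuild/cpython3_3.12/build"],
]
maxprocs = [4, 4]  # one entry per app spec -> totspawns=2, nprocs=4

# With mpi4py and an MPI runtime available, the test issues roughly:
#   from mpi4py import MPI
#   child = MPI.COMM_SELF.Spawn_multiple(commands, args, maxprocs)
#   ...
#   child.Disconnect()

# Total child processes matches --global-process-count in the log.
total_children = sum(maxprocs)
```

The `MPI4PY_TEST_SPAWN=false` variable visible in the environment dump is the package's knob for skipping these spawn tests on architectures where dynamic process management is unreliable; in this run the tests were evidently still exercised.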
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from 
downstream 30-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get_maxes [proxy:0@virt32a] Sending 
PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=barrier_in [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: 
cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 28-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] 
got pmi command from downstream 30-3: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 29-0: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P0-businesscard value=description#virt32a$port#50059$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#50059$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 29-1: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P1-businesscard value=description#virt32a$port#51715$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#51715$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P5-businesscard value=description#virt32a$port#57849$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P5-businesscard=description#virt32a$port#57849$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get kvsname=kvs_27847_30_1601083786_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 29-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 29-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: 
cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 28-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-4: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P4-businesscard value=description#virt32a$port#47689$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P4-businesscard=description#virt32a$port#47689$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result 
rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 29-2: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P2-businesscard value=description#virt32a$port#41051$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#41051$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 29-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P7-businesscard value=description#virt32a$port#59243$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P7-businesscard=description#virt32a$port#59243$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=barrier_in [proxy:0@virt32a] 
got pmi command from downstream 31-7: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P3-businesscard value=description#virt32a$port#37579$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#37579$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=put kvsname=kvs_27847_29_635828565_virt32a key=P6-businesscard value=description#virt32a$port#47955$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P6-businesscard=description#virt32a$port#47955$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get 
kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 28-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=barrier_in [proxy:0@virt32a] flushing 8 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#50059$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51715$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#57849$ifname#127.0.1.1$ 
P4-businesscard=description#virt32a$port#47689$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41051$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59243$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#37579$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#47955$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#50059$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51715$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#57849$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#47689$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41051$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59243$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#37579$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#47955$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_30_1601083786_virt32a [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 31-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 29] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#50059$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51715$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#57849$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#47689$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41051$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59243$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#37579$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#47955$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 29] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#50059$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51715$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#57849$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#47689$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41051$ifname#127.0.1.1$ 
P7-businesscard=description#virt32a$port#59243$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#37579$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#47955$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: 
cmd=barrier_out [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi 
command from downstream 31-6: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command 
from downstream 31-2: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get 
kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding 
downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-4: 
cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=barrier_in [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_31_1997143611_virt32a [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-6: 
cmd=barrier_in [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 
30-2: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70364F6465544F00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 32-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 29-6: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-4: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 31] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00
[mpiexec@virt32a] [pgid: 31] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00
[mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=barrier_in
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 31-2: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 31-4: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 31-6: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70395A68796E6C00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 29-7: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[mpiexec@virt32a] [pgid: 29] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:29:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 29-1: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-2: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-3: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-4: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-5: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-6: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-7: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_32_2082840723_virt32a
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 32] got PMI command: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 32] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00
[mpiexec@virt32a] [pgid: 32] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00
[mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7072664F4E444A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#50059$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51715$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41051$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37579$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P4-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47689$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P5-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57849$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P6-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47955$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 29-0: cmd=get kvsname=kvs_27847_29_635828565_virt32a key=P7-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59243$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 30-3: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P3-businesscard value=description#virt32a$port#54845$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#54845$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-4: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P4-businesscard value=description#virt32a$port#32789$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P4-businesscard=description#virt32a$port#32789$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-5: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P5-businesscard value=description#virt32a$port#55091$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P5-businesscard=description#virt32a$port#55091$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-4: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-1: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P1-businesscard value=description#virt32a$port#48127$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#48127$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-5: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-0: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P0-businesscard value=description#virt32a$port#37753$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#37753$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-2: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P2-businesscard value=description#virt32a$port#52169$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#52169$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-7: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P7-businesscard value=description#virt32a$port#42169$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P7-businesscard=description#virt32a$port#42169$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-7: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-6: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P6-businesscard value=description#virt32a$port#42149$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P6-businesscard=description#virt32a$port#42149$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 31-7: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P7-businesscard value=description#virt32a$port#59079$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P7-businesscard=description#virt32a$port#59079$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 31-0: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P0-businesscard value=description#virt32a$port#38301$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#38301$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 31-6: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-6: cmd=put kvsname=kvs_27847_30_1601083786_virt32a key=P6-businesscard value=description#virt32a$port#48113$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P6-businesscard=description#virt32a$port#48113$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 30-6: cmd=barrier_in
[proxy:0@virt32a] flushing 8 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P3-businesscard=description#virt32a$port#54845$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#32789$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#55091$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48127$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#37753$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52169$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#42169$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#48113$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P3-businesscard=description#virt32a$port#54845$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#32789$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#55091$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48127$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#37753$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52169$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#42169$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#48113$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 31-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-7: cmd=barrier_in
[mpiexec@virt32a] [pgid: 30] got PMI command: cmd=mput P3-businesscard=description#virt32a$port#54845$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#32789$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#55091$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48127$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#37753$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52169$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#42169$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#48113$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 30] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=keyval_cache P3-businesscard=description#virt32a$port#54845$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#32789$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#55091$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48127$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#37753$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52169$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#42169$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#48113$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 31-5: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P5-businesscard value=description#virt32a$port#49877$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P5-businesscard=description#virt32a$port#49877$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 31-5: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-1: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P1-businesscard value=description#virt32a$port#33265$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#33265$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 31-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-2: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P2-businesscard value=description#virt32a$port#51963$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#51963$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 31-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-4: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P4-businesscard value=description#virt32a$port#59647$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P4-businesscard=description#virt32a$port#59647$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 31-3: cmd=put kvsname=kvs_27847_31_1997143611_virt32a key=P3-businesscard value=description#virt32a$port#51953$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#51953$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 31-4: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-3: cmd=barrier_in
[proxy:0@virt32a] flushing 8 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P6-businesscard=description#virt32a$port#42149$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59079$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38301$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#49877$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33265$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#51963$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#59647$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#51953$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P6-businesscard=description#virt32a$port#42149$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59079$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38301$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#49877$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33265$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#51963$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#59647$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#51953$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 31] got PMI command: cmd=mput P6-businesscard=description#virt32a$port#42149$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59079$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38301$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#49877$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33265$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#51963$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#59647$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#51953$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 31] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=keyval_cache P6-businesscard=description#virt32a$port#42149$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59079$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38301$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#49877$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33265$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#51963$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#59647$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#51953$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 30-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P2-businesscard value=description#virt32a$port#54867$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#54867$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P4-businesscard value=description#virt32a$port#33503$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P4-businesscard=description#virt32a$port#33503$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P0-businesscard value=description#virt32a$port#36127$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#36127$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P3-businesscard value=description#virt32a$port#48661$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#48661$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 32-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-5: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P5-businesscard value=description#virt32a$port#45013$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P5-businesscard=description#virt32a$port#45013$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 32-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P1-businesscard value=description#virt32a$port#36263$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#36263$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 32-5: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-7: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P7-businesscard value=description#virt32a$port#59587$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P7-businesscard=description#virt32a$port#59587$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 30-5: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-4: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-7: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 30-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 31-6: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 32-6: cmd=put kvsname=kvs_27847_32_2082840723_virt32a key=P6-businesscard value=description#virt32a$port#51741$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P6-businesscard=description#virt32a$port#51741$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=barrier_in [proxy:0@virt32a] flushing 8 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#54867$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#33503$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36127$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#48661$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#45013$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36263$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59587$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#51741$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#54867$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#33503$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36127$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#48661$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#45013$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36263$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59587$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#51741$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 32] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#54867$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#33503$ifname#127.0.1.1$ 
P0-businesscard=description#virt32a$port#36127$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#48661$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#45013$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36263$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59587$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#51741$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 32] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#54867$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#33503$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36127$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#48661$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#45013$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36263$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#59587$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#51741$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=barrier_in 
[proxy:0@virt32a] got pmi command from downstream 30-6: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=barrier_in [mpiexec@virt32a] [pgid: 30] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:30:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending 
PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 31] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:31:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get 
kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 32-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 32-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 32-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37753$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48127$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52169$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54845$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=P4-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#32789$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=P5-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#55091$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get 
kvsname=kvs_27847_30_1601083786_virt32a key=P6-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48113$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=get kvsname=kvs_27847_30_1601083786_virt32a key=P7-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42169$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 32] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:32:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending 
PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-1: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-2: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-3: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38301$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33265$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51963$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51953$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a 
key=P4-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59647$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=P5-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49877$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=P6-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42149$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-4: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=get 
kvsname=kvs_27847_32_2082840723_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=get kvsname=kvs_27847_31_1997143611_virt32a key=P7-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59079$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36127$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36263$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get 
kvsname=kvs_27847_32_2082840723_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54867$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48661$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=P4-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33503$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=P5-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45013$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=P6-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51741$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=get kvsname=kvs_27847_32_2082840723_virt32a key=P7-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59587$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get 
kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 
--pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_33_196606420_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults1) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 29-4: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-6: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-5: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 29-7: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 
0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_34_1361189330_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults1) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_35_509912824_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 
'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults1) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 31-6: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 31-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 31-7: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 31-4: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 31-5: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 31-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 
arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_36_1192818972_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults1) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 31-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 31-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_33_196606420_virt32a [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 30-5: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 
[proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_33_196606420_virt32a [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 33] got PMI command: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:33:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 30-6: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 30-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 33] got PMI command: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command 
(proxy:33:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 30-7: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 30-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 30-4: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=put kvsname=kvs_27847_33_196606420_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7035717850644700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035717850644700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035717850644700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035717850644700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 33] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035717850644700 [mpiexec@virt32a] [pgid: 33] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:33:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035717850644700 [mpiexec@virt32a] Sending internal PMI command (proxy:33:0): cmd=barrier_out 
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7035717850644700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=put kvsname=kvs_27847_33_196606420_virt32a key=P0-businesscard value=description#virt32a$port#47619$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47619$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=put kvsname=kvs_27847_33_196606420_virt32a key=P1-businesscard value=description#virt32a$port#44947$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#44947$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#47619$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44947$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#47619$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44947$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 33] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#47619$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44947$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 
33] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:33:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#47619$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#44947$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:33:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 33] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:33:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 32-5: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_34_1361189330_virt32a [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ 
found=TRUE [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_36_1192818972_virt32a [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 32-4: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 32-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 32-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: 
cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 32-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 32-6: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 34] got PMI command: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:34:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 36] got PMI command: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:36:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 34-1: 
cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_34_1361189330_virt32a [proxy:0@virt32a] got pmi command from downstream 32-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 32-7: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=put kvsname=kvs_27847_34_1361189330_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70484F3755633700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70484F3755633700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_mpi_memory_alloc_kinds 
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47619$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=barrier_in [mpiexec@virt32a] [pgid: 34] got PMI command: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:34:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=get kvsname=kvs_27847_33_196606420_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#44947$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_35_509912824_virt32a [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 
vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_35_509912824_virt32a [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 35] got PMI command: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:35:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get kvsname=kvs_27847_35_509912824_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 35] got PMI command: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:35:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70484F3755633700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70484F3755633700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 34] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70484F3755633700 [mpiexec@virt32a] [pgid: 34] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:34:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70484F3755633700 [mpiexec@virt32a] Sending internal PMI command (proxy:34:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=barrier_in [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70484F3755633700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=put kvsname=kvs_27847_35_509912824_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7078415041416500 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078415041416500 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=put kvsname=kvs_27847_34_1361189330_virt32a key=P0-businesscard value=description#virt32a$port#53167$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#53167$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078415041416500 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078415041416500 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 35] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078415041416500 [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=barrier_in [mpiexec@virt32a] [pgid: 35] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:35:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078415041416500 [mpiexec@virt32a] Sending internal PMI command (proxy:35:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=put kvsname=kvs_27847_34_1361189330_virt32a key=P1-businesscard value=description#virt32a$port#46707$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#46707$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7078415041416500 found=TRUE [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput 
P0-businesscard=description#virt32a$port#53167$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46707$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#53167$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46707$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 34] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#53167$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46707$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 34] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:34:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#53167$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46707$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:34:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_36_1192818972_virt32a [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get 
kvsname=kvs_27847_36_1192818972_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 36] got PMI command: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:36:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=put kvsname=kvs_27847_36_1192818972_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704C5734375A3200 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C5734375A3200 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C5734375A3200 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C5734375A3200 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 36] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C5734375A3200 [mpiexec@virt32a] [pgid: 36] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:36:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C5734375A3200 [mpiexec@virt32a] Sending internal PMI command (proxy:36:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: 
cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704C5734375A3200 found=TRUE [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 34] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:34:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=put kvsname=kvs_27847_35_509912824_virt32a key=P0-businesscard value=description#virt32a$port#46069$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#46069$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 35-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=put kvsname=kvs_27847_35_509912824_virt32a key=P1-businesscard value=description#virt32a$port#43477$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#43477$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 34-1: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] 
got pmi command from downstream 35-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#46069$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43477$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#46069$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43477$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 35] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#46069$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43477$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 35] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:35:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#46069$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43477$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:35:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=put kvsname=kvs_27847_36_1192818972_virt32a key=P1-businesscard value=description#virt32a$port#46041$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#46041$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#53167$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 36-0: cmd=put kvsname=kvs_27847_36_1192818972_virt32a key=P0-businesscard value=description#virt32a$port#41461$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#41461$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#46041$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#41461$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#46041$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#41461$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 36] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#46041$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#41461$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 36] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:36:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#46041$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#41461$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:36:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 34-0: cmd=get kvsname=kvs_27847_34_1361189330_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46707$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 35-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-1: 
cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 36-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 35-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 35] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:35:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 36-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 36] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:36:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 35-1: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 36-1: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46069$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 35-0: cmd=get kvsname=kvs_27847_35_509912824_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#43477$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41461$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 36-0: cmd=get kvsname=kvs_27847_36_1192818972_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46041$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] we don't understand this command, forwarding upstream
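Aside: the PMI exchanges above use a flat space-separated `key=value` wire format, and the business-card values are themselves packed as `field#value$` pairs (e.g. `description#virt32a$port#45101$ifname#127.0.1.1$`). A minimal decoding sketch, assuming that format; the helper names are hypothetical and not part of MPICH or mpi4py:

```python
def parse_pmi(line):
    """Split a PMI-1 wire message such as
    'cmd=get_result rc=0 value=... found=TRUE' into a dict."""
    fields = {}
    for token in line.split():
        key, _, val = token.partition("=")
        fields[key] = val
    return fields

def parse_businesscard(value):
    """Decode a business-card value like
    'description#virt32a$port#45101$ifname#127.0.1.1$'
    into {'description': 'virt32a', 'port': '45101', ...}."""
    card = {}
    for pair in value.rstrip("$").split("$"):
        name, _, val = pair.partition("#")
        card[name] = val
    return card

msg = parse_pmi("cmd=get_result rc=0 "
                "value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE")
card = parse_businesscard(msg["value"])
```

Each `cmd=get` for a `Pn-businesscard` key in the trace is answered by a `cmd=get_result` carrying exactly this kind of value.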
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_37_1966705236_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 
'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults2) ... 
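Aside: each `mcmd=spawn ... endcmd` block in the trace above is one spawn command, and a `Spawn_multiple` call (as in `TestSpawnMultipleSelfMany`) packs several into a single payload, with `totspawns`/`spawnssofar` counting them. A rough, hypothetical parser for that payload shape (field names taken from the log; the function is illustrative only):

```python
def parse_spawn_payload(payload):
    """Split a Hydra 'mcmd=spawn ... endcmd' payload, possibly carrying
    several spawn commands, into one dict of fields per command."""
    commands = []
    for chunk in payload.split("endcmd"):
        chunk = chunk.strip()
        if not chunk:
            continue  # trailing empty segment after the last 'endcmd'
        fields = {}
        for token in chunk.split():
            key, _, val = token.partition("=")
            fields[key] = val
        commands.append(fields)
    return commands

# Abbreviated version of the two-command payload seen in the log.
payload = (
    "mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 "
    "totspawns=2 spawnssofar=1 argcnt=2 arg1=spawn_child.py arg2=build "
    "preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME info_num=0 endcmd "
    "mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 "
    "totspawns=2 spawnssofar=2 argcnt=2 arg1=spawn_child.py arg2=build "
    "info_num=0 endcmd"
)
cmds = parse_spawn_payload(payload)
```

Note that, as in the log, only the first command carries the `preput_key_0=PARENT_ROOT_PORT_NAME` entry that lets the children connect back to the parent.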
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 33-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 33-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 
--global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_38_1173510473_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_39_1319141066_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 
'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 35-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 35-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 34-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 34-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 37-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 37-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_37_1966705236_virt32a
[proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_37_1966705236_virt32a [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 37] got PMI command: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:37:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_40_1537268055_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleSelfMany.testCommSpawnDefaults2) ... 
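The PMI traffic above consists of flat, space-separated `key=value` commands (e.g. `cmd=get kvsname=... key=PMI_mpi_memory_alloc_kinds`). A minimal, standalone sketch of parsing one such line into a dict — `parse_pmi_command` is a hypothetical helper for illustration, not part of MPICH, Hydra, or mpi4py:

```python
# Hedged sketch: split a PMI-1 wire command such as
#   "cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_mpi_memory_alloc_kinds"
# into its fields. Assumes values contain no spaces, which holds for the
# commands seen in this log.

def parse_pmi_command(line: str) -> dict:
    """Parse a space-separated PMI command line into a field dict."""
    fields = {}
    for token in line.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
    return fields

cmd = parse_pmi_command(
    "cmd=get kvsname=kvs_27847_37_1966705236_virt32a "
    "key=PMI_mpi_memory_alloc_kinds"
)
print(cmd["cmd"], cmd["key"])
```

The `rc=1` replies to this particular `get` in the log indicate the key was not found in the KVS, which the proxies then forward downstream.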
[mpiexec@virt32a] [pgid: 37] got PMI command: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:37:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=put kvsname=kvs_27847_37_1966705236_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706E623073427800 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706E623073427800 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706E623073427800 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706E623073427800 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 37] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706E623073427800 [mpiexec@virt32a] [pgid: 37] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:37:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706E623073427800 [mpiexec@virt32a] Sending internal PMI command (proxy:37:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI 
command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706E623073427800 found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_39_1319141066_virt32a [proxy:0@virt32a] got pmi command from downstream 
39-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_39_1319141066_virt32a [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_38_1173510473_virt32a [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 36-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=put kvsname=kvs_27847_37_1966705236_virt32a key=P0-businesscard value=description#virt32a$port#60393$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#60393$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=put kvsname=kvs_27847_37_1966705236_virt32a key=P1-businesscard value=description#virt32a$port#51809$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#51809$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 38] got PMI command: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:38:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 39] got PMI command: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:39:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#60393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51809$ifname#127.0.1.1$ 
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#60393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51809$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 36-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 37] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#60393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51809$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 37] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:37:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#60393$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#51809$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:37:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 38-0: 
cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_38_1173510473_virt32a [mpiexec@virt32a] [pgid: 39] got PMI command: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:39:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=barrier_in [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=put kvsname=kvs_27847_39_1319141066_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7067416E67704100 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7067416E67704100 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 
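Each spawned group above queries `PMI_process_mapping` and gets back `(vector,(0,1,2))`. In MPICH's vector notation each triple reads, to my understanding, as (first node id, number of nodes, processes per node), so here both ranks land on node 0. A small illustrative parser (hypothetical helper, not MPICH code):

```python
import re

# Hedged sketch: extract the (start_node, num_nodes, procs_per_node)
# triples from a PMI_process_mapping value like "(vector,(0,1,2))".
# The interpretation of the triple fields is an assumption based on
# MPICH's vector mapping format.

def parse_process_mapping(value: str):
    """Return the list of integer triples in a vector process mapping."""
    triples = re.findall(r"\((\d+),(\d+),(\d+)\)", value)
    return [tuple(map(int, t)) for t in triples]

mapping = parse_process_mapping("(vector,(0,1,2))")
print(mapping)
```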
[proxy:0@virt32a] got pmi command from downstream 39-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7067416E67704100 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7067416E67704100 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 37] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:37:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 38] got PMI command: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:38:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 39] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7067416E67704100 [mpiexec@virt32a] [pgid: 39] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:39:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7067416E67704100 [mpiexec@virt32a] Sending internal PMI command (proxy:39:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7067416E67704100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=put kvsname=kvs_27847_38_1173510473_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70776A506B564300 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70776A506B564300 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70776A506B564300 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70776A506B564300 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 38] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70776A506B564300 [mpiexec@virt32a] [pgid: 38] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:38:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70776A506B564300 [mpiexec@virt32a] Sending internal PMI command (proxy:38:0): cmd=barrier_out 
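The `-bcast-1-0` values being `put`/`mput` above are hex-encoded, NUL-terminated strings (rank 0 broadcasting its shared-memory segment path to the other ranks in its group). A self-contained sketch decoding one of the payloads from this log — `decode_bcast_value` is an illustrative name, not a real MPICH function:

```python
# Hedged sketch: decode a hex-encoded PMI bcast payload, dropping the
# trailing NUL byte. The example input is one of the -bcast-1-0 values
# exchanged in this log.

def decode_bcast_value(hexstr: str) -> str:
    """Decode a hex string to ASCII, stripping the trailing NUL."""
    return bytes.fromhex(hexstr).rstrip(b"\x00").decode("ascii")

path = decode_bcast_value(
    "2F6465762F73686D2F6D706963685F736861725F746D706E623073427800"
)
print(path)  # a /dev/shm path used for MPICH shared-memory communication
```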
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=put kvsname=kvs_27847_39_1319141066_virt32a key=P0-businesscard value=description#virt32a$port#48421$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#48421$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=put kvsname=kvs_27847_39_1319141066_virt32a key=P1-businesscard value=description#virt32a$port#45209$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#45209$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70776A506B564300 found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60393$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#48421$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45209$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#48421$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45209$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 39] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#48421$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45209$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 39] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:39:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#48421$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45209$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:39:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=get kvsname=kvs_27847_37_1966705236_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51809$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: 
cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 39] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:39:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=put kvsname=kvs_27847_38_1173510473_virt32a key=P0-businesscard value=description#virt32a$port#52957$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#52957$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=put 
kvsname=kvs_27847_38_1173510473_virt32a key=P1-businesscard value=description#virt32a$port#34967$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#34967$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_40_1537268055_virt32a [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#52957$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34967$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#52957$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34967$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 38] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#52957$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#34967$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 38] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:38:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#52957$ifname#127.0.1.1$ 
P1-businesscard=description#virt32a$port#34967$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:38:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_40_1537268055_virt32a [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 40] got PMI command: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:40:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] we don't understand the response get_result; 
forwarding downstream [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 38] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:38:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=put kvsname=kvs_27847_40_1537268055_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70357567386E7900 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70357567386E7900 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48421$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=barrier_in 
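The `P*-businesscard` and `PARENT_ROOT_PORT_NAME` values traded above use `#` to join a key to its value and `$` as a field terminator (e.g. `description#virt32a$port#48421$ifname#127.0.1.1$`). A minimal sketch splitting such a card into fields — `parse_business_card` is a hypothetical helper for illustration only:

```python
# Hedged sketch: parse a Hydra-style business card string of the form
#   key#value$key#value$...
# into a dict. Assumes neither '#' nor '$' occurs inside a value, which
# holds for the cards in this log.

def parse_business_card(card: str) -> dict:
    """Split a '#'/'$'-delimited business card into key/value fields."""
    fields = {}
    for pair in card.split("$"):
        if "#" in pair:
            key, _, value = pair.partition("#")
            fields[key] = value
    return fields

card = parse_business_card("description#virt32a$port#48421$ifname#127.0.1.1$")
print(card["port"], card["ifname"])
```

These cards are how a spawned rank learns the hostname, port, and interface address it needs to connect back to its peers and to the parent communicator.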
[proxy:0@virt32a] got pmi command from downstream 39-0: cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45209$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 40] got PMI command: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:40:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70357567386E7900 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70357567386E7900 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 40] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70357567386E7900 [mpiexec@virt32a] [pgid: 40] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:40:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70357567386E7900 [mpiexec@virt32a] Sending internal PMI command (proxy:40:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52957$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70357567386E7900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=get kvsname=kvs_27847_38_1173510473_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34967$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: 
cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=put kvsname=kvs_27847_40_1537268055_virt32a key=P0-businesscard value=description#virt32a$port#57847$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#57847$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=put kvsname=kvs_27847_40_1537268055_virt32a key=P1-businesscard value=description#virt32a$port#39541$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#39541$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#57847$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#39541$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#57847$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#39541$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 40] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#57847$ifname#127.0.1.1$ 
P1-businesscard=description#virt32a$port#39541$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 40] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:40:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#57847$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#39541$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:40:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 40] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:40:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57847$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=get kvsname=kvs_27847_40_1537268055_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#39541$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 
arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_41_425415029_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelfMany.testErrcodes) ... 
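The simple PMI commands in this log (get, put, barrier_in, ...) use a flat space-separated `key=value` wire format. A rough sketch of splitting one such line, assuming values contain no spaces (true for the simple commands shown here, but deliberately not handling the multi-line `mcmd=spawn ... endcmd` blocks; the helper name is hypothetical):

```python
def parse_pmi_command(line: str) -> dict:
    """Split a simple one-line PMI command into a key/value dict.

    Hypothetical illustration only: real hydra parsing also handles
    mcmd=spawn ... endcmd blocks and values containing spaces,
    which this sketch ignores.
    """
    fields = {}
    for token in line.split():
        key, sep, value = token.partition("=")
        if sep:  # keep only key=value tokens
            fields[key] = value
    return fields

cmd = parse_pmi_command(
    "cmd=get kvsname=kvs_27847_39_1319141066_virt32a key=P0-businesscard"
)
print(cmd["cmd"], cmd["key"])
# → get P0-businesscard
```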
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 37-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_42_723927770_virt32a --pmi-spawner-kvsname 
kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelfMany.testErrcodes) ... 
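The business-card values returned by `get_result` throughout this log (e.g. `description#virt32a$port#48421$ifname#127.0.1.1$`) are `$`-separated segments, each a `#`-separated name/value pair. A minimal sketch of unpacking one (hypothetical helper, not part of MPICH):

```python
def parse_business_card(card: str) -> dict:
    """Unpack an MPICH business-card string into a dict.

    Hypothetical illustration: segments are '$'-separated, each
    holding a '#'-separated name/value pair; the trailing '$'
    yields an empty segment that is skipped.
    """
    fields = {}
    for segment in card.split("$"):
        if not segment:
            continue
        name, _, value = segment.partition("#")
        fields[name] = value
    return fields

card = parse_business_card("description#virt32a$port#48421$ifname#127.0.1.1$")
print(card)
# → {'description': 'virt32a', 'port': '48421', 'ifname': '127.0.1.1'}
```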
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 37-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_43_1131039355_virt32a --pmi-spawner-kvsname 
kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelfMany.testErrcodes) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_44_780692626_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 
'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleSelfMany.testErrcodes) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 38-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 38-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 39-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 39-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_41_425415029_virt32a [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 
kvsname=kvs_27847_43_1131039355_virt32a [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 40-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=get 
kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 40-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_44_780692626_virt32a [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get_maxes [proxy:0@virt32a] 
Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_43_1131039355_virt32a [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get_my_kvsname 
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_42_723927770_virt32a [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_42_723927770_virt32a [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get kvsname=kvs_27847_43_1131039355_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=barrier_in [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): 
cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=put kvsname=kvs_27847_43_1131039355_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=put kvsname=kvs_27847_42_723927770_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_41_425415029_virt32a [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_41_425415029_virt32a [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get_my_kvsname 
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_44_780692626_virt32a [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=put kvsname=kvs_27847_41_425415029_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=get kvsname=kvs_27847_44_780692626_virt32a 
key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=-bcast-1-0 [proxy:0@virt32a] 
Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7037544752626A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_43_1131039355_virt32a [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI 
command: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_42_723927770_virt32a [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_44_780692626_virt32a [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D703330546A716E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=put kvsname=kvs_27847_41_425415029_virt32a key=P0-businesscard value=description#virt32a$port#35999$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#35999$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=put kvsname=kvs_27847_41_425415029_virt32a key=P1-businesscard value=description#virt32a$port#56725$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#56725$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=put kvsname=kvs_27847_41_425415029_virt32a key=P2-businesscard value=description#virt32a$port#50677$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#50677$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): cmd=keyval_cache 
PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#35999$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#56725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#50677$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#35999$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#56725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#50677$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#35999$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#56725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#50677$ifname#127.0.1.1$ [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70526D6C50476500 found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#35999$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#56725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#50677$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=put kvsname=kvs_27847_44_780692626_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 
41-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [mpiexec@virt32a] [pgid: 41] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:41:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7033347639313000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=put kvsname=kvs_27847_42_723927770_virt32a key=P0-businesscard value=description#virt32a$port#34429$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#34429$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=put kvsname=kvs_27847_42_723927770_virt32a key=P1-businesscard value=description#virt32a$port#47657$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#47657$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=put kvsname=kvs_27847_43_1131039355_virt32a key=P1-businesscard 
value=description#virt32a$port#58551$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#58551$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=put kvsname=kvs_27847_43_1131039355_virt32a key=P2-businesscard value=description#virt32a$port#47811$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#47811$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=put kvsname=kvs_27847_42_723927770_virt32a key=P2-businesscard value=description#virt32a$port#35125$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#35125$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=put kvsname=kvs_27847_43_1131039355_virt32a key=P0-businesscard value=description#virt32a$port#56979$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#56979$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35999$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#34429$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35125$ifname#127.0.1.1$ 
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#34429$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35125$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#58551$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#47811$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56979$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#58551$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#47811$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56979$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#34429$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35125$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#34429$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#35125$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=mput 
P1-businesscard=description#virt32a$port#58551$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#47811$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56979$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#58551$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#47811$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56979$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56725$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=get kvsname=kvs_27847_41_425415029_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#50677$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 42] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:42:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=put kvsname=kvs_27847_44_780692626_virt32a key=P0-businesscard value=description#virt32a$port#37905$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#37905$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=put kvsname=kvs_27847_44_780692626_virt32a key=P1-businesscard value=description#virt32a$port#49725$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#49725$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=put kvsname=kvs_27847_44_780692626_virt32a key=P2-businesscard value=description#virt32a$port#56001$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#56001$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#37905$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56001$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#37905$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56001$ifname#127.0.1.1$ [proxy:0@virt32a] Sending 
upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 43] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:43:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#37905$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56001$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#37905$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#49725$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56001$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34429$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47657$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=get kvsname=kvs_27847_42_723927770_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35125$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=barrier_in [proxy:0@virt32a] got pmi 
command from downstream 43-0: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56979$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 44] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:44:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#58551$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=get kvsname=kvs_27847_43_1131039355_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47811$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37905$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49725$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=get kvsname=kvs_27847_44_780692626_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56001$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi 
command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnMultipleSelfMany.testNoArgs) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-gcnfd0ei.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-gcnfd0ei.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-gcnfd0ei.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-gcnfd0ei.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_45_1949372476_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a 
--pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-gcnfd0ei.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-gcnfd0ei.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 42-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 42-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 42-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnMultipleSelfMany.testNoArgs) ... 
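For reference, the `mcmd=spawn … endcmd` records in the trace above use Hydra's token-based PMI-1 wire format. The sketch below is an illustrative decoder for one such record as it appears in this log; treating the record as whitespace-separated `key=value` tokens terminated by `endcmd` is an assumption about the framing, not MPICH's actual parser.

```python
# Illustrative decoder for the whitespace-separated key=value tokens seen in
# Hydra's "mcmd=spawn ... endcmd" records above. This is NOT MPICH's parser;
# it just splits tokens the way they appear in this log.
def parse_spawn_record(record: str) -> dict:
    fields = {}
    for token in record.split():
        if token == "endcmd":          # terminator token closes the record
            break
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
    return fields

# One spawn record copied verbatim from the log above
rec = ("mcmd=spawn nprocs=1 execname=/tmp/mpi4py-orrmos64.py totspawns=2 "
       "spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME "
       "preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ "
       "info_num=0 endcmd")
fields = parse_spawn_record(rec)
print(fields["nprocs"], fields["execname"])   # -> 1 /tmp/mpi4py-orrmos64.py
```

Note that the log concatenates two spawn records per `Spawn_multiple` call (`totspawns=2`); the decoder above handles a single record up to its `endcmd`.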
[proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-orrmos64.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-orrmos64.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-orrmos64.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-orrmos64.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_46_25726626_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-orrmos64.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-orrmos64.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-h2i6n1hr.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-h2i6n1hr.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-h2i6n1hr.py totspawns=2 spawnssofar=1 argcnt=0 
preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-h2i6n1hr.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_47_141317762_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-h2i6n1hr.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-h2i6n1hr.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testNoArgs (test_spawn.TestSpawnMultipleSelfMany.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnMultipleSelfMany.testNoArgs) ... 
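The `PARENT_ROOT_PORT_NAME` preput values and the `P*-businesscard` entries in this trace share a `key#value$`-delimited encoding. A minimal sketch of reading that encoding, using a string taken directly from the log (this is an illustrative reading of the log text, not an MPICH API):

```python
def parse_port_name(value: str) -> dict:
    # Fields are "key#value" pairs, each terminated by '$', e.g.
    # "tag#0$description#virt32a$port#35709$ifname#127.0.1.1$"
    pairs = {}
    for field in value.split("$"):
        if not field:
            continue  # skip the empty piece after the trailing '$'
        key, _, val = field.partition("#")
        pairs[key] = val
    return pairs

port = parse_port_name("tag#0$description#virt32a$port#35709$ifname#127.0.1.1$")
print(port)
# {'tag': '0', 'description': 'virt32a', 'port': '35709', 'ifname': '127.0.1.1'}
```

The same parser applies to business cards such as `description#virt32a$port#37945$ifname#127.0.1.1$`, which simply omit the `tag` field.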
[proxy:0@virt32a] got pmi command from downstream 45-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_45_1949372476_virt32a [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-6vpj1ene.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME 
preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-6vpj1ene.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-6vpj1ene.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-6vpj1ene.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_48_690898473_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-6vpj1ene.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-6vpj1ene.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 41-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get_appnum [proxy:0@virt32a] Sending 
PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_45_1949372476_virt32a [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 41-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 41-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 43-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 43-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 
[proxy:0@virt32a] got pmi command from downstream 45-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_45_1949372476_virt32a [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=put kvsname=kvs_27847_45_1949372476_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 43-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 44-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_45_1949372476_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 44-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 found=TRUE [proxy:0@virt32a] 
got pmi command from downstream 45-1: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70354332564A3000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_47_141317762_virt32a [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi 
command from downstream 47-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=put kvsname=kvs_27847_45_1949372476_virt32a key=P2-businesscard value=description#virt32a$port#37945$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#37945$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=put kvsname=kvs_27847_45_1949372476_virt32a key=P1-businesscard value=description#virt32a$port#37923$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#37923$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=put kvsname=kvs_27847_45_1949372476_virt32a key=P0-businesscard value=description#virt32a$port#48489$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P0-businesscard=description#virt32a$port#48489$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#37945$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37923$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#48489$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#37945$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37923$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#48489$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_46_25726626_virt32a [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_46_25726626_virt32a [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#37945$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37923$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#48489$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#37945$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37923$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#48489$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): 
cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got 
pmi command from downstream 47-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 45] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:45:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 
[mpiexec@virt32a] [pgid: 46] got PMI command: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 46] got PMI command: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=put kvsname=kvs_27847_46_25726626_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_46_25726626_virt32a [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_47_141317762_virt32a [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=init 
pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_47_141317762_virt32a [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48489$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get 
kvsname=kvs_27847_47_141317762_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37923$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_48_690898473_virt32a [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=get kvsname=kvs_27847_45_1949372476_virt32a key=P2-businesscard 
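The repeated PMI_process_mapping value (vector,(0,1,3)) is MPICH's compressed rank-to-node map; reading each triple as (first node, number of nodes, processes per node) is my interpretation of the format, so treat this sketch as an assumption rather than a reference implementation:

```python
import re

def expand_mapping(mapping):
    """Expand an MPICH PMI_process_mapping value such as
    '(vector,(0,1,3))' into a rank -> node-id list.  Each triple is
    assumed to be (first node, number of nodes, procs per node)."""
    ranks = []
    for base, nodes, ppn in re.findall(r"\((\d+),(\d+),(\d+)\)", mapping):
        for node in range(int(base), int(base) + int(nodes)):
            ranks.extend([node] * int(ppn))
    return ranks

# (0,1,3): one node (node 0) holding all three ranks of each spawned group
mapping = expand_mapping("(vector,(0,1,3))")
```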
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37945$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_48_690898473_virt32a [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_48_690898473_virt32a [mpiexec@virt32a] [pgid: 46] got PMI command: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi 
command from downstream 48-1: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=put 
kvsname=kvs_27847_48_690898473_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=put kvsname=kvs_27847_47_141317762_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=get kvsname=kvs_27847_48_690898473_virt32a 
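The -bcast-1-0 values put into the KVS above are hex-encoded, NUL-terminated strings; decoding one from this log (a sketch, not build output) shows they are MPICH shared-memory segment paths:

```python
def decode_bcast(value):
    """Decode a hex-encoded PMI bcast payload into text,
    dropping the trailing NUL byte."""
    return bytes.fromhex(value).rstrip(b"\x00").decode("ascii")

# Value broadcast by process group 48 in this log:
path = decode_bcast("2F6465762F73686D2F6D706963685F736861725F746D705161556D493000")
```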
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 46] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 [mpiexec@virt32a] [pgid: 46] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705336474A6A4900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=barrier_in [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard 
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705161556D493000 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D703250584D547600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=put kvsname=kvs_27847_47_141317762_virt32a key=P0-businesscard value=description#virt32a$port#45839$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#45839$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=put kvsname=kvs_27847_47_141317762_virt32a key=P1-businesscard value=description#virt32a$port#45639$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#45639$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=put kvsname=kvs_27847_47_141317762_virt32a key=P2-businesscard value=description#virt32a$port#59671$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#59671$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=put kvsname=kvs_27847_48_690898473_virt32a key=P0-businesscard value=description#virt32a$port#47831$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47831$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from 
downstream 48-1: cmd=put kvsname=kvs_27847_48_690898473_virt32a key=P1-businesscard value=description#virt32a$port#47449$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#47449$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=put kvsname=kvs_27847_48_690898473_virt32a key=P2-businesscard value=description#virt32a$port#53335$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#53335$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#45839$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45639$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59671$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#45839$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45639$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59671$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#47831$ifname#127.0.1.1$ 
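The Pn-businesscard values being put and flushed above use MPICH's business-card encoding: '$'-separated 'key#value' fields. A small illustrative parser (not part of the build):

```python
def parse_business_card(card):
    """Parse an MPICH business card such as
    'description#virt32a$port#48489$ifname#127.0.1.1$'
    into a dict of '$'-separated 'key#value' fields."""
    fields = {}
    for pair in card.strip("$").split("$"):
        key, _, value = pair.partition("#")
        fields[key] = value
    return fields

card = parse_business_card("description#virt32a$port#48489$ifname#127.0.1.1$")
```

The same format covers the PARENT_ROOT_PORT_NAME values, which carry an extra leading tag#0 field.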
P1-businesscard=description#virt32a$port#47449$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53335$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#47831$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47449$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53335$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=put kvsname=kvs_27847_46_25726626_virt32a key=P2-businesscard value=description#virt32a$port#48433$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#48433$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=put kvsname=kvs_27847_46_25726626_virt32a key=P1-businesscard value=description#virt32a$port#46471$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#46471$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=barrier_in [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#45839$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45639$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59671$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#45839$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45639$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59671$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=barrier_out [proxy:0@virt32a] got pmi 
command from downstream 46-2: cmd=barrier_in [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#47831$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47449$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53335$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#47831$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#47449$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53335$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=put kvsname=kvs_27847_46_25726626_virt32a key=P0-businesscard value=description#virt32a$port#39425$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#39425$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#48433$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46471$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#39425$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#48433$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46471$ifname#127.0.1.1$
P0-businesscard=description#virt32a$port#39425$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=barrier_in [mpiexec@virt32a] [pgid: 46] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#48433$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46471$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#39425$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 46] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#48433$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46471$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#39425$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=barrier_in [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 48] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:48:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 47] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:47:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE 
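Aside: the `description#…$port#…$ifname#…$` strings that fill this log are Hydra "businesscard" records. A minimal parser, assuming only the format visible above (`$`-terminated fields, each a `key#value` pair) — this helper is hypothetical, not part of MPICH:

```python
# Hypothetical helper (not MPICH code): parse a Hydra businesscard string
# as it appears in this log, e.g.
#   description#virt32a$port#47831$ifname#127.0.1.1$
# Assumed format: '$'-terminated fields, each 'key#value'.

def parse_businesscard(card: str) -> dict:
    fields = {}
    for item in card.split("$"):
        if not item:
            continue  # the trailing '$' leaves one empty item
        key, _, value = item.partition("#")
        fields[key] = value
    return fields

print(parse_businesscard("description#virt32a$port#47831$ifname#127.0.1.1$"))
```

The same format covers the `PARENT_ROOT_PORT_NAME` values, which simply carry an extra leading `tag#0` field.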
[proxy:0@virt32a] got pmi command from downstream 46-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 46] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:46:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47831$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47449$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=get kvsname=kvs_27847_48_690898473_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=description#virt32a$port#53335$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45839$ifname#127.0.1.1$ found=TRUE ok testArgsBad (test_spawn.TestSpawnMultipleWorld.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorld.testArgsOnlyAtRoot) ... [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45639$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 47-0: cmd=get kvsname=kvs_27847_47_141317762_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59671$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#39425$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46471$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=get kvsname=kvs_27847_46_25726626_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48433$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 45-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 45-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testArgsBad (test_spawn.TestSpawnMultipleWorld.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorld.testArgsOnlyAtRoot) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsBad (test_spawn.TestSpawnMultipleWorld.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorld.testArgsOnlyAtRoot) ... 
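Aside: the repeating `put` → `barrier_in` → `mput`/`keyval_cache` → `barrier_out` → `get` sequences above follow a simple pattern: each rank publishes its businesscard, the proxy caches the puts and flushes them upstream in one `mput` at barrier time, and after the barrier any rank can `get` any other rank's card. A rough sketch of that key-value exchange semantics (an illustration, not MPICH's implementation):

```python
# Sketch (not MPICH code) of the PMI key-value exchange pattern in this log:
# puts are cached locally and only become globally visible after a barrier.

class ProxyKVS:
    def __init__(self):
        self._pending = {}  # puts cached until the next barrier
        self._cache = {}    # keyval_cache contents received from mpiexec

    def put(self, key, value):
        # cmd=put -> "cached command" -> put_result rc=0
        self._pending[key] = value
        return 0

    def barrier(self):
        # barrier_in -> "flushing N put command(s) out" as one mput,
        # then mpiexec replies with keyval_cache and barrier_out
        flushed = dict(self._pending)
        self._pending.clear()
        self._cache.update(flushed)
        return flushed

    def get(self, key):
        # cmd=get -> get_result found=TRUE (or a miss)
        return self._cache.get(key)

kvs = ProxyKVS()
kvs.put("P0-businesscard", "description#virt32a$port#47831$ifname#127.0.1.1$")
kvs.put("P1-businesscard", "description#virt32a$port#47449$ifname#127.0.1.1$")
kvs.barrier()
print(kvs.get("P1-businesscard"))
```

This is why every exchange in the log ends with one `barrier_out` per rank before the `get` requests start: the barrier is what publishes the cached puts.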
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI ok testArgsBad (test_spawn.TestSpawnMultipleWorld.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorld.testArgsOnlyAtRoot) ... [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_49_975053529_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 
'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 48-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 48-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 48-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 47-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 47-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got 
pmi command from downstream 47-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 46-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 46-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 46-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_49_975053529_virt32a [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=get 
kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_49_975053529_virt32a [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the 
response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=put kvsname=kvs_27847_49_975053529_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_49_975053529_virt32a [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=get 
kvsname=kvs_27847_49_975053529_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70684A35566D5400 found=TRUE [proxy:0@virt32a] got pmi command 
from downstream 49-1: cmd=put kvsname=kvs_27847_49_975053529_virt32a key=P1-businesscard value=description#virt32a$port#56075$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#56075$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=put kvsname=kvs_27847_49_975053529_virt32a key=P2-businesscard value=description#virt32a$port#41379$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#41379$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=put kvsname=kvs_27847_49_975053529_virt32a key=P0-businesscard value=description#virt32a$port#51341$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#51341$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#56075$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41379$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#51341$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#56075$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41379$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#51341$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=mput 
P1-businesscard=description#virt32a$port#56075$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41379$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#51341$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#56075$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41379$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#51341$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 49] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:49:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=get 
kvsname=kvs_27847_49_975053529_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51341$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56075$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=get kvsname=kvs_27847_49_975053529_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41379$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream 
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_50_416622589_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 
'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnMultipleWorld.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnMultipleWorld.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnMultipleWorld.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnMultipleWorld.testCommSpawn) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 49-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 49-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 49-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_50_416622589_virt32a [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] 
[pgid: 50] got PMI command: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:50:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_50_416622589_virt32a [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 50] got PMI command: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:50:0): cmd=get_result rc=1 
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=put kvsname=kvs_27847_50_416622589_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D705A36496B374C00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A36496B374C00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A36496B374C00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A36496B374C00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 50] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A36496B374C00 [mpiexec@virt32a] [pgid: 50] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:50:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705A36496B374C00 [mpiexec@virt32a] Sending internal PMI command (proxy:50:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705A36496B374C00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=put kvsname=kvs_27847_50_416622589_virt32a key=P0-businesscard 
value=description#virt32a$port#37233$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#37233$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=put kvsname=kvs_27847_50_416622589_virt32a key=P1-businesscard value=description#virt32a$port#38659$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#38659$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#37233$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38659$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#37233$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38659$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 50] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#37233$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38659$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 50] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:50:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#37233$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#38659$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:50:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 50-0: 
cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 50] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:50:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37233$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=get kvsname=kvs_27847_50_416622589_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38659$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi 
command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults1) ... ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults1) ... ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults1) ... 
[proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_51_1569452648_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 
'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 50-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 50-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 
51-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_51_1569452648_virt32a [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 51] got PMI command: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:51:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending 
PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_51_1569452648_virt32a [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 51] got PMI command: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:51:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=put kvsname=kvs_27847_51_1569452648_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D707432644E746100 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707432644E746100 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707432644E746100 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707432644E746100 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: 
cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 51] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707432644E746100 [mpiexec@virt32a] [pgid: 51] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:51:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707432644E746100 [mpiexec@virt32a] Sending internal PMI command (proxy:51:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D707432644E746100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=put kvsname=kvs_27847_51_1569452648_virt32a key=P0-businesscard value=description#virt32a$port#48271$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#48271$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=put kvsname=kvs_27847_51_1569452648_virt32a key=P1-businesscard value=description#virt32a$port#57251$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#57251$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#48271$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#57251$ifname#127.0.1.1$ [proxy:0@virt32a] 
Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#48271$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#57251$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 51] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#48271$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#57251$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 51] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:51:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#48271$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#57251$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:51:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 51] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:51:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48271$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=get kvsname=kvs_27847_51_1569452648_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57251$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 51-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 51-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 ok 
testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults2) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults2) ... ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults2) ... 
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_52_1289001122_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorld.testCommSpawnDefaults2) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_52_1289001122_virt32a [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 52] got PMI 
command: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:52:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=put kvsname=kvs_27847_52_1289001122_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7054376A374E4400 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7054376A374E4400 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_52_1289001122_virt32a [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 52] got PMI command: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:52:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't 
understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7054376A374E4400 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7054376A374E4400 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 52] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7054376A374E4400 [mpiexec@virt32a] [pgid: 52] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:52:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7054376A374E4400 [mpiexec@virt32a] Sending internal PMI command (proxy:52:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7054376A374E4400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=put kvsname=kvs_27847_52_1289001122_virt32a key=P0-businesscard value=description#virt32a$port#40049$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#40049$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=put kvsname=kvs_27847_52_1289001122_virt32a 
key=P1-businesscard value=description#virt32a$port#46311$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#46311$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=barrier_in [proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#40049$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46311$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#40049$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46311$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 52] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#40049$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46311$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 52] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:52:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#40049$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#46311$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:52:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 52] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:52:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: 
cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#40049$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=get kvsname=kvs_27847_52_1289001122_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46311$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testErrcodes (test_spawn.TestSpawnMultipleWorld.testErrcodes) ... ok testErrcodes (test_spawn.TestSpawnMultipleWorld.testErrcodes) ... ok testErrcodes (test_spawn.TestSpawnMultipleWorld.testErrcodes) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname 
virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_53_478521120_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnMultipleWorld.testErrcodes) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 52-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 52-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_53_478521120_virt32a [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_53_478521120_virt32a [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from 
downstream 53-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_53_478521120_virt32a [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=put kvsname=kvs_27847_53_478521120_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 [proxy:0@virt32a] cached command: 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D705456454C316500 found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=put kvsname=kvs_27847_53_478521120_virt32a key=P0-businesscard 
value=description#virt32a$port#52527$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#52527$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=put kvsname=kvs_27847_53_478521120_virt32a key=P2-businesscard value=description#virt32a$port#48097$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#48097$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=put kvsname=kvs_27847_53_478521120_virt32a key=P1-businesscard value=description#virt32a$port#42247$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#42247$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#52527$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48097$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42247$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#52527$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48097$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42247$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#52527$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48097$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42247$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream 
internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#52527$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48097$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#42247$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 53] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:53:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52527$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42247$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=get kvsname=kvs_27847_53_478521120_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48097$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnMultipleWorld.testNoArgs) ... 
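The `-bcast-1-0` values exchanged in the put/mput/get sequence above are hex-encoded byte strings. Decoding the value from the pgid 53 exchange (plain Python, no MPI needed; the trailing `00` is a NUL terminator) shows it is the path of an MPICH shared-memory segment:

```python
# Hex-encoded KVS value copied verbatim from the PMI exchange in this log.
value = "2F6465762F73686D2F6D706963685F736861725F746D705456454C316500"
raw = bytes.fromhex(value)
# Strip the NUL terminator and decode as ASCII.
path = raw.rstrip(b"\x00").decode("ascii")
print(path)  # /dev/shm/mpich_shar_tmpTVEL1e
```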
ok testNoArgs (test_spawn.TestSpawnMultipleWorld.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnMultipleWorld.testNoArgs) ... ok testNoArgs (test_spawn.TestSpawnMultipleWorld.testNoArgs) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-2us5jqxr.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-2us5jqxr.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-2us5jqxr.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-2us5jqxr.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_54_1757216952_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-2us5jqxr.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-2us5jqxr.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 53-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 53-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 53-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi 
command from downstream 54-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_54_1757216952_virt32a [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:54:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 
vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_54_1757216952_virt32a [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_54_1757216952_virt32a [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds 
[mpiexec@virt32a] Sending internal PMI command (proxy:54:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:54:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=put kvsname=kvs_27847_54_1757216952_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI 
command (proxy:54:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 [mpiexec@virt32a] Sending internal PMI command (proxy:54:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70746844554F7400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=put kvsname=kvs_27847_54_1757216952_virt32a key=P0-businesscard value=description#virt32a$port#34777$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#34777$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=put kvsname=kvs_27847_54_1757216952_virt32a key=P1-businesscard value=description#virt32a$port#45723$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#45723$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=put kvsname=kvs_27847_54_1757216952_virt32a key=P2-businesscard value=description#virt32a$port#48041$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#48041$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from 
downstream 54-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#34777$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45723$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48041$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#34777$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45723$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48041$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#34777$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45723$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48041$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:54:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#34777$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45723$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48041$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:54:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI 
command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 54] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:54:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34777$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45723$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=get kvsname=kvs_27847_54_1757216952_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48041$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get 
kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsBad (test_spawn.TestSpawnMultipleWorldMany.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorldMany.testArgsOnlyAtRoot) ... ok testArgsBad (test_spawn.TestSpawnMultipleWorldMany.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorldMany.testArgsOnlyAtRoot) ... ok testArgsBad (test_spawn.TestSpawnMultipleWorldMany.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorldMany.testArgsOnlyAtRoot) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsBad (test_spawn.TestSpawnMultipleWorldMany.testArgsBad) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnMultipleWorldMany.testArgsOnlyAtRoot) ... 
[proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_55_681869342_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 
'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 54-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 54-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 54-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=init pmi_version=1 
pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_55_681869342_virt32a [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=put kvsname=kvs_27847_55_681869342_virt32a key=-bcast-1-0 
value=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_55_681869342_virt32a [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_55_681869342_virt32a [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 55-1: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7078306643764A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=put kvsname=kvs_27847_55_681869342_virt32a key=P1-businesscard value=description#virt32a$port#59837$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#59837$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=put kvsname=kvs_27847_55_681869342_virt32a key=P2-businesscard value=description#virt32a$port#48179$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#48179$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=barrier_in [proxy:0@virt32a] got pmi command 
from downstream 55-0: cmd=put kvsname=kvs_27847_55_681869342_virt32a key=P0-businesscard value=description#virt32a$port#33505$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#33505$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#59837$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48179$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33505$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#59837$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48179$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33505$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#59837$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48179$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33505$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#59837$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48179$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33505$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: 
cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 55] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:55:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33505$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get kvsname=kvs_27847_55_681869342_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59837$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=get 
kvsname=kvs_27847_55_681869342_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48179$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawn (test_spawn.TestSpawnMultipleWorldMany.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnMultipleWorldMany.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnMultipleWorldMany.testCommSpawn) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCommSpawn (test_spawn.TestSpawnMultipleWorldMany.testCommSpawn) ... 
[proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 8 --auto-cleanup 1 --pmi-kvsname kvs_27847_56_269958847_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,8)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 
'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 55-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 55-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 55-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get_maxes [proxy:0@virt32a] 
Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI 
command (proxy:56:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending 
PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command 
(proxy:56:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI 
command (proxy:56:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 
keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_56_269958847_virt32a [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,8)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI 
command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out 
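For reference, the `-bcast-1-0` value cached and flushed upstream above is a hex-encoded, NUL-terminated filename (here an MPICH shared-memory segment under `/dev/shm` that the local ranks attach to). A minimal Python sketch decoding the exact payload from this log:

```python
# Decode the hex-encoded PMI bcast value seen in the log above.
# The payload is a NUL-terminated path, hex-encoded for the PMI wire protocol.
hex_value = "2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600"
path = bytes.fromhex(hex_value).rstrip(b"\x00").decode("ascii")
print(path)  # /dev/shm/mpich_shar_tmpkrdsKV
```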
[proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706B7264734B5600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P3-businesscard value=description#virt32a$port#38105$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P3-businesscard=description#virt32a$port#38105$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P5-businesscard value=description#virt32a$port#41885$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P5-businesscard=description#virt32a$port#41885$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P6-businesscard value=description#virt32a$port#38747$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P6-businesscard=description#virt32a$port#38747$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P0-businesscard value=description#virt32a$port#46079$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#46079$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P7-businesscard value=description#virt32a$port#60959$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P7-businesscard=description#virt32a$port#60959$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P1-businesscard value=description#virt32a$port#36933$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#36933$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: 
cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P2-businesscard value=description#virt32a$port#41117$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#41117$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=put kvsname=kvs_27847_56_269958847_virt32a key=P4-businesscard value=description#virt32a$port#57607$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P4-businesscard=description#virt32a$port#57607$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=barrier_in [proxy:0@virt32a] flushing 8 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P3-businesscard=description#virt32a$port#38105$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#41885$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#38747$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46079$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#60959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36933$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41117$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#57607$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P3-businesscard=description#virt32a$port#38105$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#41885$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#38747$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46079$ifname#127.0.1.1$ 
P7-businesscard=description#virt32a$port#60959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36933$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41117$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#57607$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=mput P3-businesscard=description#virt32a$port#38105$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#41885$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#38747$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46079$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#60959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36933$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41117$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#57607$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=keyval_cache P3-businesscard=description#virt32a$port#38105$ifname#127.0.1.1$ P5-businesscard=description#virt32a$port#41885$ifname#127.0.1.1$ P6-businesscard=description#virt32a$port#38747$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46079$ifname#127.0.1.1$ P7-businesscard=description#virt32a$port#60959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#36933$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#41117$ifname#127.0.1.1$ P4-businesscard=description#virt32a$port#57607$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI 
command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-6: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 56] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:56:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-4: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] 
got pmi command from downstream 56-6: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-7: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-5: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-2: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-3: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-1: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46079$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#36933$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41117$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38105$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P4-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57607$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P5-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41885$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P6-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38747$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 56-0: cmd=get kvsname=kvs_27847_56_269958847_virt32a key=P7-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60959$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults1) ... ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawnDefaults1 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults1) ... 
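The `P<rank>-businesscard` and `PARENT_ROOT_PORT_NAME` values exchanged above use Hydra's `key#value$` field encoding (fields such as `description`, `port`, `ifname`, optionally `tag`). A minimal sketch of a parser for one such string, using the function name `parse_businesscard` purely for illustration:

```python
# Parse a Hydra "business card" of the form key#value$key#value$...
# (trailing '$' leaves an empty final piece, which we skip).
def parse_businesscard(card: str) -> dict:
    fields = {}
    for item in card.split("$"):
        if item:
            key, _, value = item.partition("#")
            fields[key] = value
    return fields

card = "description#virt32a$port#46079$ifname#127.0.1.1$"
print(parse_businesscard(card))
# {'description': 'virt32a', 'port': '46079', 'ifname': '127.0.1.1'}
```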
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_57_1886533891_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 56-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 56-7: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 56-3: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 56-4: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 56-5: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 56-6: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 56-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 56-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_57_1886533891_virt32a
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 57] got PMI command: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:57:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_57_1886533891_virt32a
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 57] got PMI command: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:57:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=put kvsname=kvs_27847_57_1886533891_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7035673734445000
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035673734445000
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035673734445000
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035673734445000
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 57] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035673734445000
[mpiexec@virt32a] [pgid: 57] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:57:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7035673734445000
[mpiexec@virt32a] Sending internal PMI command (proxy:57:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7035673734445000 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=put kvsname=kvs_27847_57_1886533891_virt32a key=P0-businesscard value=description#virt32a$port#48389$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#48389$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=put kvsname=kvs_27847_57_1886533891_virt32a key=P1-businesscard value=description#virt32a$port#33491$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=barrier_in
[proxy:0@virt32a] flushing 2 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#48389$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#48389$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 57] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#48389$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 57] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:57:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#48389$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:57:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 57] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:57:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48389$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=get kvsname=kvs_27847_57_1886533891_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33491$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults2) ...
ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults2) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults2) ...
ok testCommSpawnDefaults2 (test_spawn.TestSpawnMultipleWorldMany.testCommSpawnDefaults2) ...
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 2 --auto-cleanup 1 --pmi-kvsname kvs_27847_58_1711431768_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,2)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 57-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 57-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_58_1711431768_virt32a
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_58_1711431768_virt32a
[mpiexec@virt32a] [pgid: 58] got PMI command: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:58:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=put kvsname=kvs_27847_58_1711431768_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70306B3437716B00
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70306B3437716B00
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,2)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 58] got PMI command: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:58:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70306B3437716B00
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70306B3437716B00
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 58] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70306B3437716B00
[mpiexec@virt32a] [pgid: 58] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:58:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70306B3437716B00
[mpiexec@virt32a] Sending internal PMI command (proxy:58:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70306B3437716B00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=put kvsname=kvs_27847_58_1711431768_virt32a key=P0-businesscard value=description#virt32a$port#54201$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#54201$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=put kvsname=kvs_27847_58_1711431768_virt32a key=P1-businesscard value=description#virt32a$port#53481$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#53481$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 58-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 58-1: cmd=barrier_in
[proxy:0@virt32a] flushing 2 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#54201$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53481$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#54201$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53481$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 58] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#54201$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53481$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 58] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:58:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#54201$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53481$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:58:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 58-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 58-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 58] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:58:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 58-1: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54201$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 58-0: cmd=get kvsname=kvs_27847_58_1711431768_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#53481$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testErrcodes (test_spawn.TestSpawnMultipleWorldMany.testErrcodes) ...
ok testErrcodes (test_spawn.TestSpawnMultipleWorldMany.testErrcodes) ...
ok testErrcodes (test_spawn.TestSpawnMultipleWorldMany.testErrcodes) ...
ok testErrcodes (test_spawn.TestSpawnMultipleWorldMany.testErrcodes) ...
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/usr/bin/python3.12 totspawns=2 spawnssofar=2 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_59_1977601704_virt32a --pmi-spawner-kvsname
kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 58-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 58-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi 
command from downstream 59-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_59_1977601704_virt32a [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 
pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=put kvsname=kvs_27847_59_1977601704_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_59_1977601704_virt32a [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_59_1977601704_virt32a [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): 
cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70756B4231505100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=put kvsname=kvs_27847_59_1977601704_virt32a key=P0-businesscard value=description#virt32a$port#36319$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#36319$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=put kvsname=kvs_27847_59_1977601704_virt32a key=P2-businesscard value=description#virt32a$port#39395$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#39395$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=put 
kvsname=kvs_27847_59_1977601704_virt32a key=P1-businesscard value=description#virt32a$port#50307$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#50307$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#36319$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39395$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#50307$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#36319$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39395$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#50307$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#36319$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39395$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#50307$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#36319$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#39395$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#50307$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=barrier_in [proxy:0@virt32a] 
got pmi command from downstream 59-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 59] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:59:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-1: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36319$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#50307$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=get kvsname=kvs_27847_59_1977601704_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=description#virt32a$port#39395$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testNoArgs (test_spawn.TestSpawnMultipleWorldMany.testNoArgs) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testNoArgs (test_spawn.TestSpawnMultipleWorldMany.testNoArgs) ...
ok testNoArgs (test_spawn.TestSpawnMultipleWorldMany.testNoArgs) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testNoArgs (test_spawn.TestSpawnMultipleWorldMany.testNoArgs) ...
[proxy:0@virt32a] got pmi command from downstream 59-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-xdl8b8dn.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-xdl8b8dn.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 59-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-xdl8b8dn.py totspawns=2 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd mcmd=spawn nprocs=2 execname=/tmp/mpi4py-xdl8b8dn.py totspawns=2 spawnssofar=2 argcnt=0 info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 3 --auto-cleanup 1 --pmi-kvsname kvs_27847_60_374878808_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,3)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-xdl8b8dn.py --exec --exec-appnum 1 --exec-proc-count 2 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-xdl8b8dn.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 59-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_60_374878808_virt32a [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] 
Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_60_374878808_virt32a [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 
[proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=1 [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_60_374878808_virt32a [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,3)) found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=barrier_in [proxy:0@virt32a] got pmi 
command from downstream 60-0: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=put kvsname=kvs_27847_60_374878808_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=barrier_out 
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7061726D556B6B00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=put kvsname=kvs_27847_60_374878808_virt32a key=P1-businesscard value=description#virt32a$port#55045$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#55045$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=put kvsname=kvs_27847_60_374878808_virt32a key=P2-businesscard value=description#virt32a$port#48039$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#48039$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=put kvsname=kvs_27847_60_374878808_virt32a key=P0-businesscard value=description#virt32a$port#47579$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=barrier_in [proxy:0@virt32a] flushing 3 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput 
P1-businesscard=description#virt32a$port#55045$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48039$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#55045$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48039$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#55045$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48039$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#55045$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#48039$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 60] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:60:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out 
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47579$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#55045$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=get kvsname=kvs_27847_60_374878808_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48039$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a 
key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build 
preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelf.testArgsOnlyAtRoot) ... [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_61_1165031907_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_62_1827261668_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 
'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 60-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 60-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 60-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [mpiexec@virt32a] 
[pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_63_310235883_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelf.testArgsOnlyAtRoot) ...
ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelf.testArgsOnlyAtRoot) ...
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_64_1361638327_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelf.testArgsOnlyAtRoot) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_61_1165031907_virt32a
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=get kvsname=kvs_27847_61_1165031907_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=get kvsname=kvs_27847_61_1165031907_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_61_1165031907_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[mpiexec@virt32a] [pgid: 61] got PMI command: cmd=get kvsname=kvs_27847_61_1165031907_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:61:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_63_310235883_virt32a
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=get kvsname=kvs_27847_63_310235883_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=put kvsname=kvs_27847_61_1165031907_virt32a key=P0-businesscard value=description#virt32a$port#54533$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#54533$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=get kvsname=kvs_27847_63_310235883_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_63_310235883_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[mpiexec@virt32a] [pgid: 63] got PMI command: cmd=get kvsname=kvs_27847_63_310235883_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:63:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#54533$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#54533$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 61] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#54533$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 61] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:61:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#54533$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:61:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_62_1827261668_virt32a
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=get kvsname=kvs_27847_62_1827261668_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=get kvsname=kvs_27847_62_1827261668_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_62_1827261668_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[mpiexec@virt32a] [pgid: 62] got PMI command: cmd=get kvsname=kvs_27847_62_1827261668_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:62:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=put kvsname=kvs_27847_63_310235883_virt32a key=P0-businesscard value=description#virt32a$port#49885$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#49885$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 61] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:61:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_64_1361638327_virt32a
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=get kvsname=kvs_27847_64_1361638327_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#49885$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#49885$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 63] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#49885$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 63] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:63:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#49885$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:63:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=get kvsname=kvs_27847_64_1361638327_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_64_1361638327_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 64] got PMI command: cmd=get kvsname=kvs_27847_64_1361638327_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:64:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=get kvsname=kvs_27847_61_1165031907_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 63] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:63:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=put kvsname=kvs_27847_62_1827261668_virt32a key=P0-businesscard value=description#virt32a$port#48361$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#48361$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=get kvsname=kvs_27847_63_310235883_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#48361$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#48361$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 62] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#48361$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 62] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:62:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#48361$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:62:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 62] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:62:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=put kvsname=kvs_27847_64_1361638327_virt32a key=P0-businesscard value=description#virt32a$port#57001$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#57001$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#57001$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#57001$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 64] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#57001$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 64] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:64:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57001$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:64:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 61-0: cmd=get kvsname=kvs_27847_61_1165031907_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54533$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=get kvsname=kvs_27847_62_1827261668_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=get kvsname=kvs_27847_63_310235883_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49885$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 64] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:64:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=get kvsname=kvs_27847_64_1361638327_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=get kvsname=kvs_27847_62_1827261668_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48361$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=get kvsname=kvs_27847_64_1361638327_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57001$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_65_1040967350_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testCommSpawn (test_spawn.TestSpawnSingleSelf.testCommSpawn) ... 
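The spawn requests logged above travel over Hydra's PMI-1 wire protocol as whitespace-separated `key=value` tokens between `mcmd=spawn` and `endcmd` (e.g. `nprocs=1`, `execname=/usr/bin/python3.12`, `preput_key_0=PARENT_ROOT_PORT_NAME`). A minimal sketch of how such a record can be tokenized — a hypothetical helper for reading these log lines, not Hydra's actual C parser:

```python
def parse_pmi_spawn(line: str) -> dict:
    """Split a PMI spawn record (as logged above) into a key/value dict.

    Illustrative only: assumes no value contains whitespace, which holds
    for the records in this log but not for PMI in general.
    """
    fields = {}
    for token in line.split():
        key, sep, value = token.partition("=")
        if sep:  # bare tokens like the trailing "endcmd" carry no value
            fields[key] = value
    return fields

cmd = ("mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 "
       "spawnssofar=1 argcnt=2 arg1=spawn_child.py preput_num=1 "
       "preput_key_0=PARENT_ROOT_PORT_NAME info_num=0 endcmd")
info = parse_pmi_spawn(cmd)
# e.g. info["nprocs"] == "1", info["execname"] == "/usr/bin/python3.12"
```

With a dict in hand it is easy to see why each spawn in this log starts exactly one child (`nprocs=1`) running `test/spawn_child.py` under `/usr/bin/python3.12`.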
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_66_820148707_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 [... 100 inherited environment variables elided; identical to the first spawn above ...] --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok testCommSpawn (test_spawn.TestSpawnSingleSelf.testCommSpawn) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_67_406973651_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 [... 100 inherited environment variables elided; identical to the first spawn above ...] --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok testCommSpawn (test_spawn.TestSpawnSingleSelf.testCommSpawn) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 63-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 64-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 62-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_68_860188938_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 [... 100 inherited environment variables elided; identical to the first spawn above ...] --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok testCommSpawn (test_spawn.TestSpawnSingleSelf.testCommSpawn) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_67_406973651_virt32a [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=get kvsname=kvs_27847_67_406973651_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=get kvsname=kvs_27847_67_406973651_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_67_406973651_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 67] got PMI command: cmd=get kvsname=kvs_27847_67_406973651_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:67:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 65-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=put kvsname=kvs_27847_67_406973651_virt32a key=P0-businesscard 
value=description#virt32a$port#34289$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#34289$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#34289$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#34289$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 67] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#34289$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 67] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:67:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#34289$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:67:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 61-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 65-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 65-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 65-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_65_1040967350_virt32a [proxy:0@virt32a] got pmi command from downstream 67-0: cmd=barrier_in [proxy:0@virt32a] 
Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 67] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:67:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 67-0: cmd=get kvsname=kvs_27847_67_406973651_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 65-0: cmd=get kvsname=kvs_27847_65_1040967350_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 67-0: cmd=get kvsname=kvs_27847_67_406973651_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34289$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 65-0: cmd=get kvsname=kvs_27847_65_1040967350_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_65_1040967350_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 65] got PMI command: cmd=get kvsname=kvs_27847_65_1040967350_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:65:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 65-0: cmd=put kvsname=kvs_27847_65_1040967350_virt32a key=P0-businesscard value=description#virt32a$port#47579$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 65-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 65] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 65] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:65:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47579$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:65:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 65-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[mpiexec@virt32a] [pgid: 65] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:65:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 65-0: cmd=get kvsname=kvs_27847_65_1040967350_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_68_860188938_virt32a
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_66_820148707_virt32a
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=get kvsname=kvs_27847_68_860188938_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=get kvsname=kvs_27847_66_820148707_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=get kvsname=kvs_27847_68_860188938_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_68_860188938_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 68] got PMI command: cmd=get kvsname=kvs_27847_68_860188938_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:68:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=get kvsname=kvs_27847_66_820148707_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_66_820148707_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 65-0: cmd=get kvsname=kvs_27847_65_1040967350_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47579$ifname#127.0.1.1$ found=TRUE
[mpiexec@virt32a] [pgid: 66] got PMI command: cmd=get kvsname=kvs_27847_66_820148707_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:66:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=put kvsname=kvs_27847_68_860188938_virt32a key=P0-businesscard value=description#virt32a$port#36823$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#36823$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#36823$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#36823$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[mpiexec@virt32a] [pgid: 68] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#36823$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 68] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:68:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36823$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:68:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 68] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:68:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=get kvsname=kvs_27847_68_860188938_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 68-0: cmd=get kvsname=kvs_27847_68_860188938_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36823$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=put kvsname=kvs_27847_66_820148707_virt32a key=P0-businesscard value=description#virt32a$port#35175$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#35175$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#35175$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#35175$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 66] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#35175$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 66] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:66:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#35175$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:66:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 66] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:66:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=get kvsname=kvs_27847_66_820148707_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_69_1993659180_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok testErrcodes (test_spawn.TestSpawnSingleSelf.testErrcodes) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] got pmi command from downstream 66-0: cmd=get kvsname=kvs_27847_66_820148707_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35175$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 67-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] got pmi command from downstream 69-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_70_1726114717_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok testErrcodes (test_spawn.TestSpawnSingleSelf.testErrcodes) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] got pmi command from downstream 69-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_71_249973345_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
ok testErrcodes (test_spawn.TestSpawnSingleSelf.testErrcodes) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_69_1993659180_virt32a [proxy:0@virt32a] got pmi command from downstream 68-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=get kvsname=kvs_27847_69_1993659180_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=get kvsname=kvs_27847_69_1993659180_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_69_1993659180_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 69] got PMI command: cmd=get kvsname=kvs_27847_69_1993659180_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:69:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=put kvsname=kvs_27847_69_1993659180_virt32a key=P0-businesscard value=description#virt32a$port#48521$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#48521$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput 
P0-businesscard=description#virt32a$port#48521$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#48521$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 69] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#48521$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 69] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:69:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#48521$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:69:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 65-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 69] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:69:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=get kvsname=kvs_27847_69_1993659180_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes 
rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=get kvsname=kvs_27847_69_1993659180_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48521$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_70_1726114717_virt32a [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 
--auto-cleanup 1 --pmi-kvsname kvs_27847_72_271590561_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnSingleSelf.testErrcodes) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=get kvsname=kvs_27847_70_1726114717_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 66-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=get kvsname=kvs_27847_70_1726114717_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_70_1726114717_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 70] got PMI command: cmd=get kvsname=kvs_27847_70_1726114717_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:70:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=put kvsname=kvs_27847_70_1726114717_virt32a key=P0-businesscard value=description#virt32a$port#57893$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#57893$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#57893$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#57893$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 70] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#57893$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 70] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:70:0): cmd=keyval_cache 
PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#57893$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:70:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 70] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:70:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-cuhhoe6m.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-cuhhoe6m.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_73_302558840_virt32a --pmi-spawner-kvsname 
kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-cuhhoe6m.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testNoArgs (test_spawn.TestSpawnSingleSelf.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=get kvsname=kvs_27847_70_1726114717_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 69-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_71_249973345_virt32a [proxy:0@virt32a] got pmi command from downstream 70-0: cmd=get kvsname=kvs_27847_70_1726114717_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57893$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get 
kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=get kvsname=kvs_27847_71_249973345_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=get kvsname=kvs_27847_71_249973345_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_71_249973345_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 
71] got PMI command: cmd=get kvsname=kvs_27847_71_249973345_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:71:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=put kvsname=kvs_27847_71_249973345_virt32a key=P0-businesscard value=description#virt32a$port#52859$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#52859$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_72_271590561_virt32a [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#52859$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#52859$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=get kvsname=kvs_27847_72_271590561_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 71] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#52859$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 71] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:71:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#52859$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI 
command (proxy:71:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=get kvsname=kvs_27847_72_271590561_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_72_271590561_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 71] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:71:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 72] got PMI command: cmd=get kvsname=kvs_27847_72_271590561_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:72:0): cmd=get_result rc=1 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=get kvsname=kvs_27847_71_249973345_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=get kvsname=kvs_27847_71_249973345_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52859$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=put kvsname=kvs_27847_72_271590561_virt32a key=P0-businesscard value=description#virt32a$port#56491$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#56491$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#56491$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#56491$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 72] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#56491$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 72] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:72:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56491$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:72:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ 
found=TRUE [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 72] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:72:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=get kvsname=kvs_27847_72_271590561_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=get kvsname=kvs_27847_72_271590561_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56491$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-vm9dknkr.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd 
= CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-vm9dknkr.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_74_1381012701_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-vm9dknkr.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testNoArgs (test_spawn.TestSpawnSingleSelf.testNoArgs) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_73_302558840_virt32a [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=get kvsname=kvs_27847_73_302558840_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=get kvsname=kvs_27847_73_302558840_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_73_302558840_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 73] got PMI command: cmd=get kvsname=kvs_27847_73_302558840_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:73:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 73-0: cmd=put kvsname=kvs_27847_73_302558840_virt32a key=P0-businesscard value=description#virt32a$port#42847$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#42847$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#42847$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#42847$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 73] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#42847$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 73] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:73:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42847$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:73:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 73] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:73:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=get kvsname=kvs_27847_73_302558840_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=get kvsname=kvs_27847_73_302558840_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42847$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnSingleSelf.testNoArgs) ... 
[proxy:0@virt32a] got pmi command from downstream 70-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-w0k461uq.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-w0k461uq.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_75_1052283188_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-w0k461uq.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 71-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 
arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_76_104447668_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelfMany.testArgsOnlyAtRoot) ... [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 73-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnSingleSelf.testNoArgs) ... 
[proxy:0@virt32a] got pmi command from downstream 74-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-gfx9ese4.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-gfx9ese4.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_77_1406739327_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-gfx9ese4.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_74_1381012701_virt32a [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=get kvsname=kvs_27847_74_1381012701_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=get kvsname=kvs_27847_74_1381012701_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_74_1381012701_virt32a key=PMI_mpi_memory_alloc_kinds 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 74] got PMI command: cmd=get kvsname=kvs_27847_74_1381012701_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:74:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=put kvsname=kvs_27847_74_1381012701_virt32a key=P0-businesscard value=description#virt32a$port#49037$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#49037$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#49037$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#49037$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_75_1052283188_virt32a [mpiexec@virt32a] [pgid: 74] got PMI command: cmd=mput 
P0-businesscard=description#virt32a$port#49037$ifname#127.0.1.1$ [proxy:0@virt32a] got pmi command from downstream 72-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [mpiexec@virt32a] [pgid: 74] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:74:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#49037$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:74:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=get kvsname=kvs_27847_75_1052283188_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=get kvsname=kvs_27847_75_1052283188_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_75_1052283188_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 75] got PMI command: cmd=get kvsname=kvs_27847_75_1052283188_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:75:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=put kvsname=kvs_27847_75_1052283188_virt32a key=P0-businesscard value=description#virt32a$port#59799$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#59799$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 
[mpiexec@virt32a] [pgid: 74] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:74:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=get kvsname=kvs_27847_74_1381012701_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#59799$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#59799$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=get kvsname=kvs_27847_74_1381012701_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49037$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 75] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#59799$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 75] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:75:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#59799$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:75:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 75] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:75:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=get kvsname=kvs_27847_75_1052283188_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 
vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_76_104447668_virt32a [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_76_104447668_virt32a [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=get kvsname=kvs_27847_75_1052283188_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59799$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=init pmi_version=1 
pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 
kvsname=kvs_27847_77_1406739327_virt32a [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=get kvsname=kvs_27847_77_1406739327_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=get kvsname=kvs_27847_77_1406739327_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_77_1406739327_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 77] got PMI command: cmd=get kvsname=kvs_27847_77_1406739327_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:77:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 
keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=put kvsname=kvs_27847_76_104447668_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_76_104447668_virt32a [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_76_104447668_virt32a [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=put kvsname=kvs_27847_77_1406739327_virt32a key=P0-businesscard value=description#virt32a$port#44367$ifname#127.0.1.1$ [proxy:0@virt32a] cached 
command: P0-businesscard=description#virt32a$port#44367$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#44367$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#44367$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get kvsname=kvs_27847_76_104447668_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 77] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#44367$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 77] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:77:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#44367$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:77:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 77] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:77:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=barrier_in [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 [proxy:0@virt32a] Sending 
upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=get kvsname=kvs_27847_77_1406739327_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7057476C694A7900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=get kvsname=kvs_27847_77_1406739327_virt32a 
key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#44367$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py 
arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_78_1193600950_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelfMany.testArgsOnlyAtRoot) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=put kvsname=kvs_27847_76_104447668_virt32a key=P2-businesscard value=description#virt32a$port#42219$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#42219$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 74-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=put kvsname=kvs_27847_76_104447668_virt32a key=P1-businesscard value=description#virt32a$port#33093$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#33093$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=put 
kvsname=kvs_27847_76_104447668_virt32a key=P3-businesscard value=description#virt32a$port#40497$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#40497$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=put kvsname=kvs_27847_76_104447668_virt32a key=P0-businesscard value=description#virt32a$port#40227$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#40227$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#42219$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33093$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#40497$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#40227$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#42219$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33093$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#40497$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#40227$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#42219$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33093$ifname#127.0.1.1$ 
P3-businesscard=description#virt32a$port#40497$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#40227$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 76] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#42219$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33093$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#40497$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#40227$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname 
kvs_27847_79_795346141_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=barrier_in [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=barrier_in ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelfMany.testArgsOnlyAtRoot) ... 
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_80_234309208_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleSelfMany.testArgsOnlyAtRoot) ... 
[mpiexec@virt32a] [pgid: 76] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:76:0): cmd=barrier_out [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get_appnum [proxy:0@virt32a] 
Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_78_1193600950_virt32a [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 75-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#40227$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33093$ifname#127.0.1.1$ found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42219$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=get kvsname=kvs_27847_76_104447668_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#40497$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 77-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 
[proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_78_1193600950_virt32a [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_79_795346141_virt32a [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: 
cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from 
downstream 78-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_78_1193600950_virt32a [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_78_1193600950_virt32a [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 78] got PMI command: 
cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_79_795346141_virt32a [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get 
kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=put kvsname=kvs_27847_78_1193600950_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_79_795346141_virt32a [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706D504456427100 found=TRUE [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_79_795346141_virt32a [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get kvsname=kvs_27847_79_795346141_virt32a 
key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=put kvsname=kvs_27847_79_795346141_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_80_234309208_virt32a [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7038375372684E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_80_234309208_virt32a [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get_appnum [proxy:0@virt32a] 
Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_80_234309208_virt32a [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_80_234309208_virt32a [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=get 
kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream 
[proxy:0@virt32a] got pmi command from downstream 80-0: cmd=put kvsname=kvs_27847_80_234309208_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=put kvsname=kvs_27847_78_1193600950_virt32a key=P0-businesscard value=description#virt32a$port#38651$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#38651$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from 
downstream 78-3: cmd=put kvsname=kvs_27847_78_1193600950_virt32a key=P3-businesscard value=description#virt32a$port#39555$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#39555$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70717949496C6C00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=put kvsname=kvs_27847_78_1193600950_virt32a key=P1-businesscard value=description#virt32a$port#53657$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#53657$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=put kvsname=kvs_27847_78_1193600950_virt32a key=P2-businesscard value=description#virt32a$port#55453$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P2-businesscard=description#virt32a$port#55453$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#38651$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#39555$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#55453$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#38651$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#39555$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#55453$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#38651$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#39555$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#55453$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#38651$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#39555$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53657$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#55453$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] 
Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=put kvsname=kvs_27847_79_795346141_virt32a key=P1-businesscard value=description#virt32a$port#46417$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#46417$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=put kvsname=kvs_27847_79_795346141_virt32a key=P0-businesscard value=description#virt32a$port#56991$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#56991$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname 
kvs_27847_81_1610223540_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=put kvsname=kvs_27847_79_795346141_virt32a key=P2-businesscard value=description#virt32a$port#54479$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#54479$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=barrier_in ok testCommSpawn (test_spawn.TestSpawnSingleSelfMany.testCommSpawn) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=put kvsname=kvs_27847_79_795346141_virt32a key=P3-businesscard value=description#virt32a$port#43327$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#43327$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#46417$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56991$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54479$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43327$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#46417$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56991$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54479$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43327$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 78] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:78:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=put kvsname=kvs_27847_80_234309208_virt32a key=P3-businesscard 
value=description#virt32a$port#50557$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#50557$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#46417$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56991$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54479$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43327$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#46417$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#56991$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54479$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43327$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=put kvsname=kvs_27847_80_234309208_virt32a key=P0-businesscard value=description#virt32a$port#38983$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#38983$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=put kvsname=kvs_27847_80_234309208_virt32a key=P2-businesscard value=description#virt32a$port#60607$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#60607$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=put kvsname=kvs_27847_80_234309208_virt32a key=P1-businesscard value=description#virt32a$port#52701$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#52701$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P3-businesscard=description#virt32a$port#50557$ifname#127.0.1.1$ 
P0-businesscard=description#virt32a$port#38983$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60607$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#52701$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P3-businesscard=description#virt32a$port#50557$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38983$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60607$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#52701$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=mput P3-businesscard=description#virt32a$port#50557$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38983$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60607$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#52701$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=keyval_cache P3-businesscard=description#virt32a$port#50557$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38983$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60607$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#52701$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get 
kvsname=kvs_27847_78_1193600950_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38651$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#53657$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#55453$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=get kvsname=kvs_27847_78_1193600950_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#39555$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 79] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:79:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 76-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 76-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 76-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a 
key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream 
internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 80] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:80:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 76-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=P0-businesscard 
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56991$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46417$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38983$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54479$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=get kvsname=kvs_27847_79_795346141_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#43327$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52701$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60607$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: 
cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=get kvsname=kvs_27847_80_234309208_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#50557$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_81_1610223540_virt32a [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from 
downstream 81-0: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_81_1610223540_virt32a [mpiexec@virt32a] [pgid: 81] got PMI 
command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 81] got PMI command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=put kvsname=kvs_27847_81_1610223540_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 
pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_81_1610223540_virt32a [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_81_1610223540_virt32a [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 81] got PMI command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 81] got PMI command: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 81] got PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 [mpiexec@virt32a] [pgid: 81] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7079355673365900 found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=put kvsname=kvs_27847_81_1610223540_virt32a key=P3-businesscard value=description#virt32a$port#49791$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#49791$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=put kvsname=kvs_27847_81_1610223540_virt32a key=P2-businesscard 
value=description#virt32a$port#34641$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#34641$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=put kvsname=kvs_27847_81_1610223540_virt32a key=P0-businesscard value=description#virt32a$port#44797$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#44797$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=put kvsname=kvs_27847_81_1610223540_virt32a key=P1-businesscard value=description#virt32a$port#40179$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#40179$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P3-businesscard=description#virt32a$port#49791$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#34641$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#44797$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#40179$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P3-businesscard=description#virt32a$port#49791$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#34641$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#44797$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#40179$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 81] 
got PMI command: cmd=mput P3-businesscard=description#virt32a$port#49791$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#34641$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#44797$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#40179$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 81] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=keyval_cache P3-businesscard=description#virt32a$port#49791$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#34641$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#44797$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#40179$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 
--iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_82_217315141_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST ok testCommSpawn (test_spawn.TestSpawnSingleSelfMany.testCommSpawn) ... 
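The `mcmd=spawn … endcmd` request in the trace above is a flat, whitespace-separated `key=value` wire format. As a rough sketch (the helper `parse_spawn_cmd` is hypothetical, not part of Hydra), such a request can be decoded like this:

```python
def parse_spawn_cmd(text):
    """Parse a Hydra PMI 'mcmd=spawn ... endcmd' request into a dict.

    Tokens are whitespace-separated key=value pairs; the literal token
    'endcmd' terminates the request. Positional keys such as arg1, arg2
    and preput_key_0 are kept as-is.
    """
    fields = {}
    for token in text.split():
        if token == "endcmd":
            break
        key, sep, value = token.partition("=")
        if sep:
            fields[key] = value
    return fields

# Abbreviated version of the spawn request from the log above.
cmd = ("mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 "
       "spawnssofar=1 argcnt=2 arg1=test/spawn_child.py "
       "preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME info_num=0 endcmd")
print(parse_spawn_cmd(cmd)["nprocs"])  # -> 4
```

Note that all values come back as strings; Hydra itself converts counts like `nprocs` and `argcnt` to integers internally.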
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_83_1523310330_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=barrier_in ok testCommSpawn (test_spawn.TestSpawnSingleSelfMany.testCommSpawn) ... 
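The `PARENT_ROOT_PORT_NAME` values exchanged above (e.g. `tag#0$description#virt32a$port#34485$ifname#127.0.1.1$`) use MPICH's business-card encoding: `$`-separated fields, each a key and value joined by `#`. A minimal decoder (the function name is an assumption for illustration) would be:

```python
def parse_business_card(card):
    """Split an MPICH/Hydra business-card string such as
    'tag#0$description#virt32a$port#34485$ifname#127.0.1.1$'
    into a dict. Fields are '$'-separated; key and value are joined by '#'.
    """
    fields = {}
    for item in card.rstrip("$").split("$"):
        key, _, value = item.partition("#")
        fields[key] = value
    return fields

card = "tag#0$description#virt32a$port#34485$ifname#127.0.1.1$"
info = parse_business_card(card)
print(info["port"], info["ifname"])  # -> 34485 127.0.1.1
```

In the log, the parent publishes this card via `preput_key_0`/`preput_val_0` at spawn time, and each child later retrieves it with `cmd=get key=PARENT_ROOT_PORT_NAME` to connect back to the parent.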
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 81] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:81:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_84_2088744660_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) 
--global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE ok testCommSpawn (test_spawn.TestSpawnSingleSelfMany.testCommSpawn) ... 
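The `cmd=get`/`cmd=get_result` and `cmd=put` traffic above is a key-value store (KVS) lookup protocol: a hit answers `rc=0 … found=TRUE`, while a miss (as with the unset `PMI_mpi_memory_alloc_kinds` key later in this log) answers `rc=1`. A toy stand-in, simplified in that real Hydra scopes one KVS per process group (e.g. `kvs_27847_82_…_virt32a`):

```python
class KVS:
    """Toy in-memory stand-in for a Hydra PMI key-value space."""

    def __init__(self):
        self._store = {}

    def put(self, key, value):
        self._store[key] = value
        return "cmd=put_result rc=0"

    def get(self, key):
        # Hit: rc=0 with the value and found=TRUE; miss: rc=1.
        if key in self._store:
            return f"cmd=get_result rc=0 value={self._store[key]} found=TRUE"
        return "cmd=get_result rc=1"

kvs = KVS()
kvs.put("PMI_process_mapping", "(vector,(0,1,4))")
print(kvs.get("PMI_process_mapping"))
print(kvs.get("PMI_mpi_memory_alloc_kinds"))

# Values stored under '-bcast-*' keys in this log are hex-encoded byte
# strings; decoding the one that appears below recovers a NUL-terminated
# shared-memory path used by MPICH's shm transport.
payload = bytes.fromhex(
    "2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00")
print(payload)
```

This mirrors why the proxy answers most `get` requests locally but forwards `PMI_mpi_memory_alloc_kinds` upstream: only mpiexec can authoritatively report a key that was never put.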
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 78-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#44797$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#40179$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 78-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 79-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get kvsname=kvs_27847_81_1610223540_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34641$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 81-0: cmd=get 
kvsname=kvs_27847_81_1610223540_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49791$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 79-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 80-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 80-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 80-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 80-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 
82-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_82_217315141_virt32a [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI 
command: cmd=my_kvsname rc=0 kvsname=kvs_27847_82_217315141_virt32a [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_84_2088744660_virt32a [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=get 
kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_84_2088744660_virt32a [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_84_2088744660_virt32a [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=put 
kvsname=kvs_27847_82_217315141_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 83-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=barrier_in [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 
pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_82_217315141_virt32a [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI 
command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_82_217315141_virt32a [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=barrier_in [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_83_1523310330_virt32a [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=get_appnum 
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_83_1523310330_virt32a [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 
pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_83_1523310330_virt32a [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI 
command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=barrier_in [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_83_1523310330_virt32a [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get 
kvsname=kvs_27847_83_1523310330_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_84_2088744660_virt32a [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] 
Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 found=TRUE [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70583469534D6A00 found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=barrier_in [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 84-0: 
cmd=put kvsname=kvs_27847_84_2088744660_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=put kvsname=kvs_27847_83_1523310330_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI 
command (proxy:83:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704C72324C4B6E00 found=TRUE [proxy:0@virt32a] got pmi command 
from downstream 84-1: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D707534476F4A5400 found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=put kvsname=kvs_27847_82_217315141_virt32a key=P1-businesscard value=description#virt32a$port#60073$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#60073$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=put kvsname=kvs_27847_82_217315141_virt32a key=P2-businesscard value=description#virt32a$port#57467$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#57467$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=put kvsname=kvs_27847_82_217315141_virt32a key=P3-businesscard value=description#virt32a$port#41693$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#41693$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from 
downstream 82-0: cmd=put kvsname=kvs_27847_82_217315141_virt32a key=P0-businesscard value=description#virt32a$port#33817$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#33817$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#60073$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#57467$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41693$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33817$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#60073$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#57467$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41693$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33817$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#60073$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#57467$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41693$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33817$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#60073$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#57467$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41693$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#33817$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command 
(proxy:82:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_85_1974532094_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnSingleSelfMany.testErrcodes) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=put kvsname=kvs_27847_83_1523310330_virt32a key=P0-businesscard value=description#virt32a$port#49681$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#49681$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=put kvsname=kvs_27847_83_1523310330_virt32a key=P3-businesscard value=description#virt32a$port#55371$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P3-businesscard=description#virt32a$port#55371$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=put kvsname=kvs_27847_84_2088744660_virt32a key=P1-businesscard value=description#virt32a$port#56327$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#56327$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=put kvsname=kvs_27847_84_2088744660_virt32a key=P3-businesscard value=description#virt32a$port#54107$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#54107$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=put kvsname=kvs_27847_83_1523310330_virt32a key=P1-businesscard value=description#virt32a$port#53007$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#53007$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=put kvsname=kvs_27847_84_2088744660_virt32a key=P2-businesscard value=description#virt32a$port#56377$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#56377$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=put kvsname=kvs_27847_83_1523310330_virt32a key=P2-businesscard value=description#virt32a$port#59817$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P2-businesscard=description#virt32a$port#59817$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=put kvsname=kvs_27847_84_2088744660_virt32a key=P0-businesscard value=description#virt32a$port#47905$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47905$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#49681$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#55371$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53007$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59817$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#49681$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#55371$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53007$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59817$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#56327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54107$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56377$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47905$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput 
P1-businesscard=description#virt32a$port#56327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54107$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56377$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47905$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#49681$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#55371$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53007$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59817$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#49681$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#55371$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#53007$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#59817$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=barrier_out [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#56327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54107$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56377$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47905$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#56327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54107$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#56377$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47905$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command 
(proxy:84:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [mpiexec@virt32a] [pgid: 82] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:82:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-0: 
cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 82-3: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 81-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 81-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 83-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 84] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:84:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=description#virt32a$port#33817$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 83] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:83:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60073$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57467$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get 
kvsname=kvs_27847_83_1523310330_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=get kvsname=kvs_27847_82_217315141_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41693$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 83-3: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47905$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#49681$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56327$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#53007$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#56377$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59817$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=get kvsname=kvs_27847_84_2088744660_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54107$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command 
from downstream 83-0: cmd=get kvsname=kvs_27847_83_1523310330_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#55371$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get 
kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_85_1974532094_virt32a [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command 
(proxy:85:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_85_1974532094_virt32a [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: 
cmd=my_kvsname rc=0 kvsname=kvs_27847_85_1974532094_virt32a [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_85_1974532094_virt32a [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream 
[proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=put kvsname=kvs_27847_85_1974532094_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 [proxy:0@virt32a] Sending 
upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70744B584B793300 found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 
arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_86_57696024_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=put kvsname=kvs_27847_85_1974532094_virt32a key=P1-businesscard value=description#virt32a$port#42971$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#42971$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=put kvsname=kvs_27847_85_1974532094_virt32a key=P2-businesscard value=description#virt32a$port#46815$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#46815$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 ok testErrcodes (test_spawn.TestSpawnSingleSelfMany.testErrcodes) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=put kvsname=kvs_27847_85_1974532094_virt32a key=P0-businesscard value=description#virt32a$port#38181$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#38181$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=put kvsname=kvs_27847_85_1974532094_virt32a key=P3-businesscard value=description#virt32a$port#59127$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#59127$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#42971$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46815$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38181$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#59127$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#42971$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46815$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38181$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#59127$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=mput 
P1-businesscard=description#virt32a$port#42971$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46815$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38181$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#59127$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#42971$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#46815$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38181$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#59127$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 82-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 85] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:85:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 82-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out 
[proxy:0@virt32a] got pmi command from downstream 82-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-2: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 85-1: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-3: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build 
preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_87_211219860_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnSingleSelfMany.testErrcodes) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 82-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py 
arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_88_1713582337_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnSingleSelfMany.testErrcodes) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38181$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 84-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42971$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=P2-businesscard 
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46815$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 85-0: cmd=get kvsname=kvs_27847_85_1974532094_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59127$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 84-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] 
got pmi command from downstream 83-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_86_57696024_virt32a [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 83-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 83-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 
vallen_max=1024 [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_86_57696024_virt32a [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 83-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_86_57696024_virt32a [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 86-1: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=barrier_in [proxy:0@virt32a] 
got pmi command from downstream 86-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_86_57696024_virt32a [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=put kvsname=kvs_27847_86_57696024_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput 
-bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 found=TRUE [proxy:0@virt32a] got pmi command 
from downstream 86-3: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7064457350575700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_87_211219860_virt32a [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 
kvsname=kvs_27847_87_211219860_virt32a [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_87_211219860_virt32a [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_87_211219860_virt32a [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get 
kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 87] got PMI command: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 
88-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_88_1713582337_virt32a [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 87] got PMI command: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_88_1713582337_virt32a [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-1: 
cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 88] got PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:88:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [mpiexec@virt32a] [pgid: 87] got PMI command: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 87] got PMI command: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 88] got PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending 
internal PMI command (proxy:88:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_88_1713582337_virt32a [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_88_1713582337_virt32a [proxy:0@virt32a] 
got pmi command from downstream 88-0: cmd=put kvsname=kvs_27847_88_1713582337_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=put kvsname=kvs_27847_87_211219860_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [mpiexec@virt32a] [pgid: 88] got PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal 
PMI command (proxy:88:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 88] got PMI command: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:88:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 87] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 [mpiexec@virt32a] [pgid: 87] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 [mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=barrier_out [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI 
command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=put kvsname=kvs_27847_86_57696024_virt32a key=P3-businesscard value=description#virt32a$port#57597$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#57597$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=put kvsname=kvs_27847_86_57696024_virt32a key=P1-businesscard value=description#virt32a$port#45497$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#45497$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704A3832734B4700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=put kvsname=kvs_27847_86_57696024_virt32a key=P0-businesscard value=description#virt32a$port#34241$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#34241$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: 
cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 88] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=barrier_in [mpiexec@virt32a] [pgid: 88] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:88:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 [mpiexec@virt32a] Sending internal PMI command (proxy:88:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=-bcast-1-0 
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70377A7049646700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=put kvsname=kvs_27847_86_57696024_virt32a key=P2-businesscard value=description#virt32a$port#52543$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#52543$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P3-businesscard=description#virt32a$port#57597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45497$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#34241$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52543$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P3-businesscard=description#virt32a$port#57597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45497$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#34241$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52543$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=mput P3-businesscard=description#virt32a$port#57597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45497$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#34241$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52543$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=keyval_cache P3-businesscard=description#virt32a$port#57597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#45497$ifname#127.0.1.1$ 
P0-businesscard=description#virt32a$port#34241$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#52543$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=barrier_in [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/tmp/mpi4py-tsw3pqnu.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/tmp/mpi4py-tsw3pqnu.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_89_1769127792_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-tsw3pqnu.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=barrier_in ok testNoArgs (test_spawn.TestSpawnSingleSelfMany.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [mpiexec@virt32a] [pgid: 86] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:86:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=put kvsname=kvs_27847_87_211219860_virt32a key=P0-businesscard value=description#virt32a$port#43959$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#43959$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: 
cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=put kvsname=kvs_27847_87_211219860_virt32a key=P1-businesscard value=description#virt32a$port#48795$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#48795$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=put kvsname=kvs_27847_87_211219860_virt32a key=P2-businesscard value=description#virt32a$port#36327$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#36327$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=put kvsname=kvs_27847_87_211219860_virt32a key=P3-businesscard value=description#virt32a$port#45271$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#45271$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 87-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 87-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 87-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 87-2: cmd=barrier_in
[proxy:0@virt32a] flushing 4 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#43959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48795$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#36327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#45271$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#43959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48795$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#36327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#45271$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 87] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#43959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48795$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#36327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#45271$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 87] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#43959$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#48795$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#36327$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#45271$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 88-1: cmd=put kvsname=kvs_27847_88_1713582337_virt32a key=P1-businesscard value=description#virt32a$port#44545$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#44545$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 88-2: cmd=put kvsname=kvs_27847_88_1713582337_virt32a key=P2-businesscard value=description#virt32a$port#43517$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#43517$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 88-2: cmd=barrier_in
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 88-3: cmd=put kvsname=kvs_27847_88_1713582337_virt32a key=P3-businesscard value=description#virt32a$port#35061$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#35061$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 88-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 88-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=put kvsname=kvs_27847_88_1713582337_virt32a key=P0-businesscard value=description#virt32a$port#39711$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#39711$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=barrier_in
[proxy:0@virt32a] flushing 4 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#44545$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#43517$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35061$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#39711$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#44545$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#43517$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35061$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#39711$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 88] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#44545$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#43517$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35061$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#39711$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 88] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:88:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#44545$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#43517$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35061$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#39711$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:88:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 88-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 85-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 87-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 88-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 85-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 85-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34241$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 85-3: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45497$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 87-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 87-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52543$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 88-1: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 88] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:88:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 86-0: cmd=get kvsname=kvs_27847_86_57696024_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#57597$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 87-3: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[mpiexec@virt32a] [pgid: 87] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:87:0): cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 88-1: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 88-2: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 88-3: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 87-1: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 87-2: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 87-3: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_89_1769127792_virt32a
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_89_1769127792_virt32a
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#43959$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#48795$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=put kvsname=kvs_27847_89_1769127792_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36327$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 87-0: cmd=get kvsname=kvs_27847_87_211219860_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45271$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#39711$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#44545$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#43517$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 88-0: cmd=get kvsname=kvs_27847_88_1713582337_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35061$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_89_1769127792_virt32a
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_89_1769127792_virt32a
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706F556E79554F00 found=TRUE
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/tmp/mpi4py-ygzyrisn.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/tmp/mpi4py-ygzyrisn.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_90_41337916_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-ygzyrisn.py
[mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testNoArgs (test_spawn.TestSpawnSingleSelfMany.testNoArgs) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=put kvsname=kvs_27847_89_1769127792_virt32a key=P1-businesscard value=description#virt32a$port#34669$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#34669$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=put kvsname=kvs_27847_89_1769127792_virt32a key=P0-businesscard value=description#virt32a$port#42223$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#42223$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=put kvsname=kvs_27847_89_1769127792_virt32a key=P2-businesscard value=description#virt32a$port#53577$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#53577$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=put kvsname=kvs_27847_89_1769127792_virt32a key=P3-businesscard value=description#virt32a$port#47047$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#47047$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=barrier_in
[proxy:0@virt32a] flushing 4 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#34669$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42223$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53577$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47047$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#34669$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42223$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53577$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47047$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#34669$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42223$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53577$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47047$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 89] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#34669$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#42223$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#53577$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#47047$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] we don't understand this command, forwarding upstream
[proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/tmp/mpi4py-sihzq_hh.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/tmp/mpi4py-sihzq_hh.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ info_num=0 endcmd
Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_91_2088461145_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=.
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-sihzq_hh.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testNoArgs (test_spawn.TestSpawnSingleSelfMany.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 89-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 89-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 89-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 89-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 89] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:89:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 86-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a 
key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 89-1: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 89-2: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 86-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 86-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 86-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 89-3: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 
kvsname=kvs_27847_91_2088461145_virt32a [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 91] got PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 91-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 87-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 87-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/tmp/mpi4py-jtkytisi.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/tmp/mpi4py-jtkytisi.py totspawns=1 spawnssofar=1 argcnt=0 
preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_92_786676051_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-jtkytisi.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testNoArgs (test_spawn.TestSpawnSingleSelfMany.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 87-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 87-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#42223$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34669$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#53577$ifname#127.0.1.1$ found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=get kvsname=kvs_27847_89_1769127792_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47047$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-2: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] 
got pmi command from downstream 90-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 88-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 88-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 88-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_90_41337916_virt32a [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_90_41337916_virt32a [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 
pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 88-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 90] got PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 90] got PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending 
internal PMI command (proxy:90:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_90_41337916_virt32a [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_90_41337916_virt32a [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get 
kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 91-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [mpiexec@virt32a] [pgid: 90] got PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=get_result rc=1 [mpiexec@virt32a] [pgid: 90] got PMI command: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 91-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 91-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=put kvsname=kvs_27847_90_41337916_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=barrier_in 
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 90-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 90] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 [mpiexec@virt32a] [pgid: 90] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 [mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get kvsname=kvs_27847_90_41337916_virt32a 
key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 90-2: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D706C69476F4B7100 found=TRUE [proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_91_2088461145_virt32a [proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_91_2088461145_virt32a [proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_91_2088461145_virt32a [proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE 
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=put kvsname=kvs_27847_91_2088461145_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_92_786676051_virt32a
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_92_786676051_virt32a
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=barrier_in
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=put kvsname=kvs_27847_92_786676051_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00
[proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704B474233494100 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_92_786676051_virt32a
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_92_786676051_virt32a
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=put kvsname=kvs_27847_90_41337916_virt32a key=P0-businesscard value=description#virt32a$port#51585$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#51585$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 90-1: cmd=put kvsname=kvs_27847_90_41337916_virt32a key=P1-businesscard value=description#virt32a$port#43069$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#43069$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 90-2: cmd=put kvsname=kvs_27847_90_41337916_virt32a key=P2-businesscard value=description#virt32a$port#54959$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#54959$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_process_mapping
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 90-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 90-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 90-3: cmd=put kvsname=kvs_27847_90_41337916_virt32a key=P3-businesscard value=description#virt32a$port#54101$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#54101$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 90-3: cmd=barrier_in
[proxy:0@virt32a] flushing 4 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#51585$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43069$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54959$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54101$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#51585$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43069$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54959$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54101$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=get_result rc=1
[mpiexec@virt32a] [pgid: 90] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#51585$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43069$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54959$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54101$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 90] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=keyval_cache P0-businesscard=description#virt32a$port#51585$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#43069$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#54959$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#54101$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=barrier_out
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=put kvsname=kvs_27847_91_2088461145_virt32a key=P2-businesscard value=description#virt32a$port#37775$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#37775$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 91-3: cmd=put kvsname=kvs_27847_91_2088461145_virt32a key=P3-businesscard value=description#virt32a$port#43257$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#43257$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 91-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=put kvsname=kvs_27847_91_2088461145_virt32a key=P0-businesscard value=description#virt32a$port#59695$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#59695$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=put kvsname=kvs_27847_91_2088461145_virt32a key=P1-businesscard value=description#virt32a$port#55915$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#55915$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=barrier_in
[proxy:0@virt32a] flushing 4 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#37775$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43257$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#59695$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55915$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#37775$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43257$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#59695$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55915$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=-bcast-1-0
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70354552714E6A00 found=TRUE
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#37775$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43257$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#59695$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55915$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#37775$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#43257$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#59695$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#55915$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 90-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 90-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] got pmi command from downstream 90-3: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorld.testArgsOnlyAtRoot) ...
[mpiexec@virt32a] [pgid: 90] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:90:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 91-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 90-2: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 91] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:91:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 90-1: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 90-3: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=put kvsname=kvs_27847_92_786676051_virt32a key=P3-businesscard value=description#virt32a$port#33491$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=put kvsname=kvs_27847_92_786676051_virt32a key=P0-businesscard value=description#virt32a$port#53581$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#53581$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 91-3: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=put kvsname=kvs_27847_92_786676051_virt32a key=P2-businesscard value=description#virt32a$port#60281$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#60281$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=put kvsname=kvs_27847_92_786676051_virt32a key=P1-businesscard value=description#virt32a$port#37353$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#37353$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=barrier_in
[proxy:0@virt32a] flushing 4 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P3-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53581$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60281$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37353$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P3-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53581$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60281$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37353$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=mput P3-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53581$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60281$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37353$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=keyval_cache P3-businesscard=description#virt32a$port#33491$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#53581$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#60281$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#37353$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#51585$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#43069$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 89-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54959$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-3: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=get kvsname=kvs_27847_90_41337916_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#54101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#59695$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 89-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#55915$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37775$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=get kvsname=kvs_27847_91_2088461145_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#43257$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=barrier_in
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 92] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:92:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-2: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-3: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-1: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#53581$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#37353$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#60281$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 92-0: cmd=get kvsname=kvs_27847_92_786676051_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33491$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorld.testArgsOnlyAtRoot) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorld.testArgsOnlyAtRoot) ...
[proxy:0@virt32a] got pmi command from downstream 90-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 90-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 90-3: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 90-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 91-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 91-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 91-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 91-3: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorld.testArgsOnlyAtRoot) ...
[proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_93_1868599584_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 92-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 92-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 92-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 92-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 
pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_93_1868599584_virt32a [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=get kvsname=kvs_27847_93_1868599584_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=get kvsname=kvs_27847_93_1868599584_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_93_1868599584_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 93] got PMI command: cmd=get kvsname=kvs_27847_93_1868599584_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:93:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=put kvsname=kvs_27847_93_1868599584_virt32a key=P0-businesscard value=description#virt32a$port#47549$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#47549$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#47549$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput 
P0-businesscard=description#virt32a$port#47549$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 93] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#47549$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 93] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:93:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#47549$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:93:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 93] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:93:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=get kvsname=kvs_27847_93_1868599584_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=get kvsname=kvs_27847_93_1868599584_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47549$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get 
kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 
--pmi-kvsname kvs_27847_94_251213381_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] got pmi command from downstream 93-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 ok testCommSpawn (test_spawn.TestSpawnSingleWorld.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnSingleWorld.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnSingleWorld.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnSingleWorld.testCommSpawn) ... 
[proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=init pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=get_maxes
[proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=get_appnum
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=get_my_kvsname
[proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_94_251213381_virt32a
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=get kvsname=kvs_27847_94_251213381_virt32a key=PMI_hwloc_xmlfile
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=get kvsname=kvs_27847_94_251213381_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_94_251213381_virt32a key=PMI_mpi_memory_alloc_kinds
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 94] got PMI command: cmd=get kvsname=kvs_27847_94_251213381_virt32a key=PMI_mpi_memory_alloc_kinds
[mpiexec@virt32a] Sending internal PMI command (proxy:94:0): cmd=get_result rc=1
[proxy:0@virt32a] we don't understand the response get_result; forwarding downstream
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=put kvsname=kvs_27847_94_251213381_virt32a key=P0-businesscard value=description#virt32a$port#38621$ifname#127.0.1.1$
[proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#38621$ifname#127.0.1.1$
[proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=barrier_in
[proxy:0@virt32a] flushing 1 put command(s) out
[proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#38621$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#38621$ifname#127.0.1.1$
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 94] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#38621$ifname#127.0.1.1$
[mpiexec@virt32a] [pgid: 94] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:94:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#38621$ifname#127.0.1.1$
[mpiexec@virt32a] Sending internal PMI command (proxy:94:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=barrier_in
[proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI
[mpiexec@virt32a] [pgid: 94] got PMI command: cmd=barrier_in
[mpiexec@virt32a] Sending internal PMI command (proxy:94:0): cmd=barrier_out
[proxy:0@virt32a] Sending PMI command: cmd=barrier_out
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=get kvsname=kvs_27847_94_251213381_virt32a key=PARENT_ROOT_PORT_NAME
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=get kvsname=kvs_27847_94_251213381_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#38621$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard
[proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testErrcodes (test_spawn.TestSpawnSingleWorld.testErrcodes) ... ok
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
testErrcodes (test_spawn.TestSpawnSingleWorld.testErrcodes) ...
[proxy:0@virt32a] got pmi command from downstream 94-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testErrcodes (test_spawn.TestSpawnSingleWorld.testErrcodes) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR
ok testErrcodes (test_spawn.TestSpawnSingleWorld.testErrcodes) ...
[proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_95_830730_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_95_830730_virt32a [proxy:0@virt32a] got pmi command from 
downstream 95-0: cmd=get kvsname=kvs_27847_95_830730_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=get kvsname=kvs_27847_95_830730_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_95_830730_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 95] got PMI command: cmd=get kvsname=kvs_27847_95_830730_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:95:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=put kvsname=kvs_27847_95_830730_virt32a key=P0-businesscard value=description#virt32a$port#34027$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#34027$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#34027$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#34027$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 95] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#34027$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 95] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:95:0): cmd=keyval_cache 
PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#34027$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:95:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 95] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:95:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=get kvsname=kvs_27847_95_830730_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=get kvsname=kvs_27847_95_830730_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34027$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get 
kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnSingleWorld.testNoArgs) ... ok testNoArgs (test_spawn.TestSpawnSingleWorld.testNoArgs) ... ok testNoArgs (test_spawn.TestSpawnSingleWorld.testNoArgs) ... ok testNoArgs (test_spawn.TestSpawnSingleWorld.testNoArgs) ... [proxy:0@virt32a] got pmi command from downstream 95-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=1 execname=/tmp/mpi4py-hw_8_v8r.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=1 execname=/tmp/mpi4py-hw_8_v8r.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 1 --auto-cleanup 1 --pmi-kvsname kvs_27847_96_762083287_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,0) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 
'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 1 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-hw_8_v8r.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_96_762083287_virt32a [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=get kvsname=kvs_27847_96_762083287_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=get kvsname=kvs_27847_96_762083287_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal 
PMI command: cmd=get kvsname=kvs_27847_96_762083287_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 96] got PMI command: cmd=get kvsname=kvs_27847_96_762083287_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:96:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=put kvsname=kvs_27847_96_762083287_virt32a key=P0-businesscard value=description#virt32a$port#36503$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#36503$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P0-businesscard=description#virt32a$port#36503$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P0-businesscard=description#virt32a$port#36503$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 96] got PMI command: cmd=mput P0-businesscard=description#virt32a$port#36503$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 96] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:96:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#36503$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:96:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 96] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:96:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=get kvsname=kvs_27847_96_762083287_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=get kvsname=kvs_27847_96_762083287_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36503$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 96-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] 
Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorldMany.testArgsOnlyAtRoot) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorldMany.testArgsOnlyAtRoot) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorldMany.testArgsOnlyAtRoot) ... ok testArgsOnlyAtRoot (test_spawn.TestSpawnSingleWorldMany.testArgsOnlyAtRoot) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_97_1071362088_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 
-ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_97_1071362088_virt32a [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get 
kvsname=kvs_27847_97_1071362088_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_97_1071362088_virt32a [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-0: 
cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=get_result rc=1 [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_97_1071362088_virt32a [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=put kvsname=kvs_27847_97_1071362088_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_97_1071362088_virt32a [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command 
from downstream 97-3: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=-bcast-1-0 [proxy:0@virt32a] 
Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7069794948683600 found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=put kvsname=kvs_27847_97_1071362088_virt32a key=P2-businesscard value=description#virt32a$port#36931$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#36931$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=put kvsname=kvs_27847_97_1071362088_virt32a key=P0-businesscard value=description#virt32a$port#45221$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#45221$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=put kvsname=kvs_27847_97_1071362088_virt32a key=P3-businesscard value=description#virt32a$port#41071$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#41071$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=put kvsname=kvs_27847_97_1071362088_virt32a key=P1-businesscard value=description#virt32a$port#33407$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: 
P1-businesscard=description#virt32a$port#33407$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#36931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45221$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41071$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33407$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#36931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45221$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41071$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33407$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#36931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45221$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41071$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33407$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#36931$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45221$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#41071$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#33407$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] 
Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 97] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:97:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 97-0: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45221$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#33407$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#36931$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=get kvsname=kvs_27847_97_1071362088_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41071$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR 
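The business cards exchanged above (`description#virt32a$port#45221$ifname#127.0.1.1$`) use MPICH's `#`/`$`-delimited card format. A minimal, hypothetical decoder (not part of Hydra, included only to make the layout explicit):

```python
def parse_business_card(card: str) -> dict:
    """Split an MPICH-style business card into its fields.

    Cards in this trace look like:
        'description#virt32a$port#45221$ifname#127.0.1.1$'
    i.e. '$'-separated segments, each a 'key#value' pair.
    """
    fields = {}
    for segment in card.rstrip("$").split("$"):
        key, _, value = segment.partition("#")
        fields[key] = value
    return fields
```

For example, the P0 business card above decodes to host `virt32a`, port `45221`, interface `127.0.1.1`; the `PARENT_ROOT_PORT_NAME` value carries one extra `tag#0` segment in the same format.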
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawn (test_spawn.TestSpawnSingleWorldMany.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnSingleWorldMany.testCommSpawn) ... ok testCommSpawn (test_spawn.TestSpawnSingleWorldMany.testCommSpawn) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCommSpawn (test_spawn.TestSpawnSingleWorldMany.testCommSpawn) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_98_407804381_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 
'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 97-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 97-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 97-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 97-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get_appnum 
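Each `cmd=... key=value ...` line in this trace is a PMI-1 wire command: whitespace-separated `key=value` tokens. A toy parser (an illustration, not Hydra's actual implementation) shows the structure:

```python
def parse_pmi_command(line: str) -> dict:
    """Parse a PMI-1 wire command such as
        'cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_process_mapping'
    into a dict. PMI-1 values in these exchanges contain no spaces,
    so a plain whitespace split suffices for this sketch.
    """
    fields = {}
    for token in line.split():
        key, _, value = token.partition("=")
        fields[key] = value
    return fields
```

The `cmd=maxes` responses above, for instance, advertise the KVS limits (`kvsname_max=256 keylen_max=64 vallen_max=1024`) that every spawned rank queries during `init`.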
[proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_98_407804381_virt32a [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 
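The `-bcast-1-0` KVS values in this trace are hex-encoded, NUL-terminated byte strings (here, the path of an MPICH shared-memory segment under `/dev/shm`). A small decoder, an assumption-level sketch included only to show the encoding:

```python
import binascii

def decode_bcast_value(hex_value: str) -> str:
    """Decode a Hydra '-bcast-*' KVS value: hex digits encoding a
    NUL-terminated byte string (e.g. a /dev/shm path)."""
    raw = binascii.unhexlify(hex_value)
    return raw.rstrip(b"\x00").decode("ascii")
```

Applied to the value broadcast by rank 0 of process group 97 above, this yields a `/dev/shm/mpich_shar_tmp*` path that the other local ranks attach to.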
[proxy:0@virt32a] got pmi command from downstream 98-0: cmd=put kvsname=kvs_27847_98_407804381_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_98_407804381_virt32a [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_98_407804381_virt32a [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 
kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_98_407804381_virt32a [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from 
downstream 98-2: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=keyval_cache 
PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D70695371764A5700 found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=put kvsname=kvs_27847_98_407804381_virt32a key=P2-businesscard value=description#virt32a$port#40375$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#40375$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=put kvsname=kvs_27847_98_407804381_virt32a key=P3-businesscard value=description#virt32a$port#35337$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#35337$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from 
downstream 98-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=put kvsname=kvs_27847_98_407804381_virt32a key=P0-businesscard value=description#virt32a$port#45851$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#45851$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=put kvsname=kvs_27847_98_407804381_virt32a key=P1-businesscard value=description#virt32a$port#50659$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#50659$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#40375$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35337$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45851$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#50659$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#40375$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35337$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45851$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#50659$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#40375$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35337$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45851$ifname#127.0.1.1$ 
P1-businesscard=description#virt32a$port#50659$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#40375$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#35337$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#45851$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#50659$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 98] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:98:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45851$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#50659$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#40375$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=get kvsname=kvs_27847_98_407804381_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35337$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testErrcodes (test_spawn.TestSpawnSingleWorldMany.testErrcodes) ... ok testErrcodes (test_spawn.TestSpawnSingleWorldMany.testErrcodes) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/usr/bin/python3.12 totspawns=1 spawnssofar=1 argcnt=2 arg1=/build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py arg2=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 
0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_99_1622272225_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 3 /usr/bin/python3.12 /build/reproducible-path/mpi4py-4.0.0/test/spawn_child.py /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 ok testErrcodes (test_spawn.TestSpawnSingleWorldMany.testErrcodes) ... ok testErrcodes (test_spawn.TestSpawnSingleWorldMany.testErrcodes) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 98-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 98-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 98-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 98-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_99_1622272225_virt32a [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get 
kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=put kvsname=kvs_27847_99_1622272225_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_99_1622272225_virt32a [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command 
from downstream 99-1: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_99_1622272225_virt32a [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: 
cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_99_1622272225_virt32a [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=get 
kvsname=kvs_27847_99_1622272225_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 
value=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D704E4E797A445A00 found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=put kvsname=kvs_27847_99_1622272225_virt32a key=P1-businesscard value=description#virt32a$port#35029$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#35029$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=put kvsname=kvs_27847_99_1622272225_virt32a key=P3-businesscard value=description#virt32a$port#53897$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#53897$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=put kvsname=kvs_27847_99_1622272225_virt32a key=P2-businesscard value=description#virt32a$port#45359$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#45359$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=put kvsname=kvs_27847_99_1622272225_virt32a key=P0-businesscard value=description#virt32a$port#46945$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#46945$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=barrier_in [proxy:0@virt32a] flushing 4 put 
command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P1-businesscard=description#virt32a$port#35029$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#53897$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45359$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46945$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P1-businesscard=description#virt32a$port#35029$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#53897$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45359$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46945$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=mput P1-businesscard=description#virt32a$port#35029$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#53897$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45359$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46945$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=keyval_cache P1-businesscard=description#virt32a$port#35029$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#53897$ifname#127.0.1.1$ P2-businesscard=description#virt32a$port#45359$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#46945$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=barrier_in 
[proxy:0@virt32a] got pmi command from downstream 99-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 99] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:99:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#46945$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get 
kvsname=kvs_27847_99_1622272225_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35029$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45359$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=get kvsname=kvs_27847_99_1622272225_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#53897$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnSingleWorldMany.testNoArgs) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnSingleWorldMany.testNoArgs) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testNoArgs (test_spawn.TestSpawnSingleWorldMany.testNoArgs) ... ok testNoArgs (test_spawn.TestSpawnSingleWorldMany.testNoArgs) ... [proxy:0@virt32a] we don't understand this command, forwarding upstream [proxy:0@virt32a] mcmd=spawn nprocs=4 execname=/tmp/mpi4py-1y3r8a0z.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 0] got PMI command: mcmd=spawn nprocs=4 execname=/tmp/mpi4py-1y3r8a0z.py totspawns=1 spawnssofar=1 argcnt=0 preput_num=1 preput_key_0=PARENT_ROOT_PORT_NAME preput_val_0=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ info_num=0 endcmd Arguments being passed to proxy 0: --version 4.2.0 --iface-ip-env-name MPIR_CVAR_CH3_INTERFACE_HOSTNAME --hostname virt32a --global-core-map 0,1,1 --pmi-id-map 0,0 --global-process-count 4 --auto-cleanup 1 --pmi-kvsname kvs_27847_100_917537621_virt32a --pmi-spawner-kvsname kvs_27847_0_289500754_virt32a --pmi-process-mapping (vector,(0,1,4)) --global-inherited-env 100 'DEB_HOST_GNU_SYSTEM=linux-gnueabihf' 'SUDO_GID=113' 'OMPI_MCA_plm=isolated' 'DFLAGS=-frelease' 'DEB_BUILD_ARCH_BITS=32' 'DEB_TARGET_GNU_CPU=arm' 'MAIL=/var/mail/root' 'DEB_HOST_ARCH_OS=linux' 'LANGUAGE=en_US:en' 'USER=pbuilder1' 'OMPI_MCA_rmaps_base_oversubscribe=true' 'ASFLAGS_FOR_BUILD=' 'CXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_TYPE=arm-linux-gnueabihf' 'BUILDUSERNAME=pbuilder1' 'FFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection' 'DEB_TARGET_MULTIARCH=arm-linux-gnueabihf' 'OBJCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DH_INTERNAL_OPTIONS=-O--buildsystem=pybuild' 'DEB_BUILD_ARCH_CPU=arm' 'SHLVL=2' 'DEB_HOST_ARCH_LIBC=gnu' 'DEB_HOST_ARCH_ABI=eabihf' 'OLDPWD=/' 'BUILDUSERGECOS=first user,first room,first work-phone,first home-phone,first other' 'HOME=/nonexistent/first-build' 'DEB_BUILD_ARCH_ENDIAN=little' 'DFLAGS_FOR_BUILD=-frelease' 'LDFLAGS=-Wl,-z,relro' 'DEB_TARGET_ARCH_BITS=32' 'DEB_BUILD_GNU_SYSTEM=linux-gnueabihf' 'MAKEFLAGS=w' 'PBUILDER_OPERATION=build' 'CXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'OMPI_MCA_btl_vader_single_copy_mechanism=none' 'MPI4PY_TEST_SPAWN=false' 'OBJCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_ARCH_OS=linux' 'DEB_TARGET_GNU_TYPE=arm-linux-gnueabihf' 'SUDO_UID=107' 'PBUILDER_PKGDATADIR=/usr/share/pbuilder' 'DEB_TARGET_ARCH_CPU=arm' 'https_proxy=http://127.0.0.1:9/' 'LOGNAME=pbuilder1' 'http_proxy=http://127.0.0.1:9/' 'DEB_BUILD_ARCH_LIBC=gnu' 'DEB_BUILD_ARCH_ABI=eabihf' 'PBUILDER_SYSCONFDIR=/etc' '_=/usr/bin/unshare' 'OMPI_MCA_btl_base_warn_component_unused=false' 'DEB_HOST_ARCH=armhf' 'LDFLAGS_FOR_BUILD=-Wl,-z,relro' 'DEB_TARGET_ARCH_ENDIAN=little' 'TERM=unknown' 'DH_INTERNAL_OVERRIDE=dh_auto_test' 'DEB_HOST_GNU_CPU=arm' 'DEB_TARGET_GNU_SYSTEM=linux-gnueabihf' 'PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' 'INVOCATION_ID=610244979ed544ba900474fb6a7bad85' 'DEB_TARGET_ARCH_OS=linux' 'CFLAGS=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'MAKELEVEL=2' 'DEB_HOST_MULTIARCH=arm-linux-gnueabihf' 'SOURCE_DATE_EPOCH=1727596276' 'FCFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'OBJCXXFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'LANG=C' 'LD_PRELOAD=' 'DEB_TARGET_ARCH_LIBC=gnu' 'DEB_TARGET_ARCH_ABI=eabihf' 'PBUILDER_PKGLIBDIR=/usr/lib/pbuilder' 'DEB_BUILD_OPTIONS=buildinfo=+all reproducible=+all parallel=3 ' 'SUDO_COMMAND=/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/pbuilderrc_TY97 --distribution trixie --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1 --logfile b1/build.log mpi4py_4.0.0-8.dsc' 'CPPFLAGS=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'DEBIAN_FRONTEND=noninteractive' 'DH_INTERNAL_BUILDFLAGS=1' 'SHELL=/bin/bash' 'GITHUB_ACTIONS=true' 'DEB_HOST_ARCH_BITS=32' 'DEB_BUILD_ARCH=armhf' 'SUDO_USER=jenkins' 'CFLAGS_FOR_BUILD=-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'DEB_BUILD_GNU_CPU=arm' 'ASFLAGS=' 'DEB_HOST_GNU_TYPE=arm-linux-gnueabihf' 'FCFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'LC_ALL=C' 'OBJCXXFLAGS_FOR_BUILD=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. 
-fstack-protector-strong -fstack-clash-protection -Wformat -Werror=format-security' 'PWD=/build/reproducible-path/mpi4py-4.0.0' 'DEB_HOST_ARCH_CPU=arm' 'DEB_RULES_REQUIRES_ROOT=no' 'FFLAGS=-g -O2 -ffile-prefix-map=/build/reproducible-path/mpi4py-4.0.0=. -fstack-protector-strong -fstack-clash-protection' 'DEB_BUILD_MULTIARCH=arm-linux-gnueabihf' 'CPPFLAGS_FOR_BUILD=-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 -Wdate-time -D_FORTIFY_SOURCE=2' 'PYTHONPATH=/build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build' 'MFLAGS=-w' 'TZ=/usr/share/zoneinfo/Etc/GMT+12' 'PBCURRENTCOMMANDLINEOPERATION=build' 'DEB_HOST_ARCH_ENDIAN=little' 'DEB_TARGET_ARCH=armhf' --global-user-env 0 --global-system-env 1 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --proxy-core-count 1 --exec --exec-appnum 0 --exec-proc-count 4 --exec-local-env 1 'PMI_SPAWNED=1' --exec-wdir /build/reproducible-path/mpi4py-4.0.0 --exec-args 1 /tmp/mpi4py-1y3r8a0z.py [mpiexec@virt32a] Sending internal PMI command (proxy:0:0): cmd=spawn_result rc=0 [proxy:0@virt32a] got pmi command from downstream 99-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 99-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 99-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PID_LIST [proxy:0@virt32a] we don't understand the response spawn_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 99-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: 
cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_100_917537621_virt32a [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_100_917537621_virt32a [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from 
downstream 100-1: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=put kvsname=kvs_27847_100_917537621_virt32a key=-bcast-1-0 value=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 [proxy:0@virt32a] cached command: -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 
[proxy:0@virt32a] got pmi command from downstream 100-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_100_917537621_virt32a [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=init pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] Sending PMI command: cmd=response_to_init rc=0 pmi_version=1 pmi_subversion=1 [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=get_maxes [proxy:0@virt32a] Sending PMI command: cmd=maxes rc=0 kvsname_max=256 keylen_max=64 vallen_max=1024 [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=get_appnum [proxy:0@virt32a] Sending PMI command: cmd=appnum rc=0 appnum=0 [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from 
downstream 100-3: cmd=get_my_kvsname [proxy:0@virt32a] Sending PMI command: cmd=my_kvsname rc=0 kvsname=kvs_27847_100_917537621_virt32a [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_process_mapping [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=(vector,(0,1,4)) found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_hwloc_xmlfile [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=/tmp/hydra_hwloc_xmlfile_gHmcm1 found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream internal PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PMI_mpi_memory_alloc_kinds [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=get_result rc=1 [proxy:0@virt32a] we don't understand the response get_result; forwarding downstream [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=barrier_in [proxy:0@virt32a] flushing 1 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=mput -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 [mpiexec@virt32a] [pgid: 
100] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=keyval_cache PARENT_ROOT_PORT_NAME=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ -bcast-1-0=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=-bcast-1-0 [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=2F6465762F73686D2F6D706963685F736861725F746D7032646A796C6800 found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=put kvsname=kvs_27847_100_917537621_virt32a key=P2-businesscard value=description#virt32a$port#50379$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P2-businesscard=description#virt32a$port#50379$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=put kvsname=kvs_27847_100_917537621_virt32a key=P3-businesscard value=description#virt32a$port#44939$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P3-businesscard=description#virt32a$port#44939$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: 
cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=put kvsname=kvs_27847_100_917537621_virt32a key=P0-businesscard value=description#virt32a$port#52597$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P0-businesscard=description#virt32a$port#52597$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=put kvsname=kvs_27847_100_917537621_virt32a key=P1-businesscard value=description#virt32a$port#41141$ifname#127.0.1.1$ [proxy:0@virt32a] cached command: P1-businesscard=description#virt32a$port#41141$ifname#127.0.1.1$ [proxy:0@virt32a] Sending PMI command: cmd=put_result rc=0 [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=barrier_in [proxy:0@virt32a] flushing 4 put command(s) out [proxy:0@virt32a] forwarding command upstream: cmd=mput P2-businesscard=description#virt32a$port#50379$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#44939$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#52597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41141$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream internal PMI command: cmd=mput P2-businesscard=description#virt32a$port#50379$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#44939$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#52597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41141$ifname#127.0.1.1$ [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=mput P2-businesscard=description#virt32a$port#50379$ifname#127.0.1.1$ 
P3-businesscard=description#virt32a$port#44939$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#52597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41141$ifname#127.0.1.1$ [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=keyval_cache P2-businesscard=description#virt32a$port#50379$ifname#127.0.1.1$ P3-businesscard=description#virt32a$port#44939$ifname#127.0.1.1$ P0-businesscard=description#virt32a$port#52597$ifname#127.0.1.1$ P1-businesscard=description#virt32a$port#41141$ifname#127.0.1.1$ [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=barrier_in [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=barrier_in [proxy:0@virt32a] Sending upstream internal PMI command: cmd=barrier_in [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_PMI [mpiexec@virt32a] [pgid: 100] got PMI command: cmd=barrier_in [mpiexec@virt32a] Sending internal PMI command (proxy:100:0): cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] Sending PMI command: cmd=barrier_out [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=get 
kvsname=kvs_27847_100_917537621_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=PARENT_ROOT_PORT_NAME [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=tag#0$description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#52597$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#41141$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#50379$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=get kvsname=kvs_27847_100_917537621_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#44939$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P0-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#45101$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi 
command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P1-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#34485$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P2-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#35709$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] got pmi command from downstream 0-0: cmd=get kvsname=kvs_27847_0_289500754_virt32a key=P3-businesscard [proxy:0@virt32a] Sending PMI command: cmd=get_result rc=0 value=description#virt32a$port#47245$ifname#127.0.1.1$ found=TRUE [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testConstructor (test_status.TestStatus.testConstructor) ... ok testCopyConstructor (test_status.TestStatus.testCopyConstructor) ... ok testDefaultFieldValues (test_status.TestStatus.testDefaultFieldValues) ... ok testGetCount (test_status.TestStatus.testGetCount) ... ok testGetElements (test_status.TestStatus.testGetElements) ... ok testIsCancelled (test_status.TestStatus.testIsCancelled) ... ok testPickle (test_status.TestStatus.testPickle) ... ok testPyProps (test_status.TestStatus.testPyProps) ... ok testSetCancelled (test_status.TestStatus.testSetCancelled) ... ok testSetElements (test_status.TestStatus.testSetElements) ... ok testCloneFree (test_subclass.TestMyCartcommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyCartcommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyCartcommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyCartcommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyCartcommWORLD.testCloneFree) ... ok testConstructor (test_status.TestStatus.testConstructor) ... ok testCopyConstructor (test_status.TestStatus.testCopyConstructor) ... 
ok testDefaultFieldValues (test_status.TestStatus.testDefaultFieldValues) ... ok testGetCount (test_status.TestStatus.testGetCount) ... ok testGetElements (test_status.TestStatus.testGetElements) ... ok testIsCancelled (test_status.TestStatus.testIsCancelled) ... ok testPickle (test_status.TestStatus.testPickle) ... ok testPyProps (test_status.TestStatus.testPyProps) ... ok testSetCancelled (test_status.TestStatus.testSetCancelled) ... ok testSetElements (test_status.TestStatus.testSetElements) ... ok testCloneFree (test_subclass.TestMyCartcommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyCartcommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyCartcommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyCartcommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyCartcommWORLD.testCloneFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testConstructor (test_status.TestStatus.testConstructor) ... ok testConstructor (test_status.TestStatus.testConstructor) ... ok testCopyConstructor (test_status.TestStatus.testCopyConstructor) ... ok testDefaultFieldValues (test_status.TestStatus.testDefaultFieldValues) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetCount (test_status.TestStatus.testGetCount) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetElements (test_status.TestStatus.testGetElements) ... ok testIsCancelled (test_status.TestStatus.testIsCancelled) ... ok testPickle (test_status.TestStatus.testPickle) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok ok testPyProps (test_status.TestStatus.testPyProps) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCopyConstructor (test_status.TestStatus.testCopyConstructor) ... ok testDefaultFieldValues (test_status.TestStatus.testDefaultFieldValues) ... ok testSetCancelled (test_status.TestStatus.testSetCancelled) ... ok testSetElements (test_status.TestStatus.testSetElements) ... ok testCloneFree (test_subclass.TestMyCartcommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyCartcommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyCartcommSELF.testCloneFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetCount (test_status.TestStatus.testGetCount) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyCartcommSELF.testSubType) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCloneFree (test_subclass.TestMyCartcommWORLD.testCloneFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testGetElements (test_status.TestStatus.testGetElements) ... ok testIsCancelled (test_status.TestStatus.testIsCancelled) ... ok testPickle (test_status.TestStatus.testPickle) ... ok testPyProps (test_status.TestStatus.testPyProps) ... ok testSetCancelled (test_status.TestStatus.testSetCancelled) ... ok testSetElements (test_status.TestStatus.testSetElements) ... Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCloneFree (test_subclass.TestMyCartcommNULL.testCloneFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyCartcommNULL.testSubType) ... 
ok testCloneFree (test_subclass.TestMyCartcommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyCartcommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyCartcommWORLD.testCloneFree) ... [proxy:0@virt32a] got pmi command from downstream 100-0: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 100-2: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] got pmi command from downstream 100-1: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyCartcommWORLD.testSubType) ... ok testCloneFree (test_subclass.TestMyCommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyCommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyCommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyCommSELF.testSubType) ... ok testSubType (test_subclass.TestMyCartcommWORLD.testSubType) ... ok testCloneFree (test_subclass.TestMyCommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyCommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyCommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyCommSELF.testSubType) ... ok ok testSubType (test_subclass.TestMyCartcommWORLD.testSubType) ... ok testCloneFree (test_subclass.TestMyCommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyCommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyCommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyCommSELF.testSubType) ... ok testSubType (test_subclass.TestMyCartcommWORLD.testSubType) ... ok testCloneFree (test_subclass.TestMyCommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyCommNULL.testSubType) ... 
ok testCloneFree (test_subclass.TestMyCommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyCommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyCommWORLD.testCloneFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCloneFree (test_subclass.TestMyCommWORLD.testCloneFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] got pmi command from downstream 100-3: cmd=finalize [proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0 ok testCloneFree (test_subclass.TestMyCommWORLD.testCloneFree) ... ok testCloneFree (test_subclass.TestMyCommWORLD.testCloneFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyCommWORLD.testSubType) ... ok testFree (test_subclass.TestMyFile.testFree) ... ok testSubType (test_subclass.TestMyFile.testSubType) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyCommWORLD.testSubType) ... ok testFree (test_subclass.TestMyFile.testFree) ... ok testSubType (test_subclass.TestMyFile.testSubType) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyCommWORLD.testSubType) ... ok testFree (test_subclass.TestMyFile.testFree) ... ok testSubType (test_subclass.TestMyFile.testSubType) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyCommWORLD.testSubType) ... ok testFree (test_subclass.TestMyFile.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCloneFree (test_subclass.TestMyGraphcommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommSELF.testCloneFree) ... 
ok testSubType (test_subclass.TestMyGraphcommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommWORLD.testSubType) ... ok testSubType (test_subclass.TestMyGrequest.testSubType) ... ok testCreateDupType (test_subclass.TestMyInfo.testCreateDupType) ... ok ok testCloneFree (test_subclass.TestMyGraphcommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommWORLD.testSubType) ... ok testSubType (test_subclass.TestMyGrequest.testSubType) ... ok testCreateDupType (test_subclass.TestMyInfo.testCreateDupType) ... ok ok testCloneFree (test_subclass.TestMyGraphcommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommWORLD.testSubType) ... ok testSubType (test_subclass.TestMyGrequest.testSubType) ... ok testCreateDupType (test_subclass.TestMyInfo.testCreateDupType) ... ok testCreateEnvType (test_subclass.TestMyInfo.testCreateEnvType) ... ok testSubType (test_subclass.TestMyFile.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyGraphcommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyGraphcommWORLD.testSubType) ... 
ok testSubType (test_subclass.TestMyGrequest.testSubType) ... ok testCreateDupType (test_subclass.TestMyInfo.testCreateDupType) ... ok testCreateEnvType (test_subclass.TestMyInfo.testCreateEnvType) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testCreateEnvType (test_subclass.TestMyInfo.testCreateEnvType) ... ok testFree (test_subclass.TestMyInfo.testFree) ... ok testPickle (test_subclass.TestMyInfo.testPickle) ... ok testSubType (test_subclass.TestMyInfo.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommWORLD.testSubType) ... ok testCreateEnvType (test_subclass.TestMyInfo.testCreateEnvType) ... ok testFree (test_subclass.TestMyInfo.testFree) ... ok testPickle (test_subclass.TestMyInfo.testPickle) ... ok testSubType (test_subclass.TestMyInfo.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommWORLD.testCloneFree) ... 
ok testSubType (test_subclass.TestMyIntracommWORLD.testSubType) ... ok ok testFree (test_subclass.TestMyInfo.testFree) ... ok testPickle (test_subclass.TestMyInfo.testPickle) ... ok testSubType (test_subclass.TestMyInfo.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommWORLD.testSubType) ... ok ok testFree (test_subclass.TestMyInfo.testFree) ... ok testPickle (test_subclass.TestMyInfo.testPickle) ... ok testSubType (test_subclass.TestMyInfo.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommNULL.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommNULL.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommSELF.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommSELF.testSubType) ... ok testCloneFree (test_subclass.TestMyIntracommWORLD.testCloneFree) ... ok testSubType (test_subclass.TestMyIntracommWORLD.testSubType) ... ok testSubType (test_subclass.TestMyPrequest.testSubType) ... ok testSubType (test_subclass.TestMyPrequest.testSubType) ... ok testSubType (test_subclass.TestMyPrequest.testSubType) ... ok testStart (test_subclass.TestMyPrequest2.testStart) ... testStart (test_subclass.TestMyPrequest2.testStart) ... testStart (test_subclass.TestMyPrequest2.testStart) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubType (test_subclass.TestMyPrequest2.testSubType) ... ok testSubType (test_subclass.TestMyRequest.testSubType) ... 
ok testSubType (test_subclass.TestMyRequest2.testSubType) ... ok testFree (test_subclass.TestMyWin.testFree) ... ok testSubType (test_subclass.TestMyWin.testSubType) ... ok testIsThreadMain (test_threads.TestMPIThreads.testIsThreadMain) ... ok testIsThreadMainInThread (test_threads.TestMPIThreads.testIsThreadMainInThread) ... ok testSubType (test_subclass.TestMyPrequest2.testSubType) ... ok testSubType (test_subclass.TestMyRequest.testSubType) ... ok testSubType (test_subclass.TestMyRequest2.testSubType) ... ok testFree (test_subclass.TestMyWin.testFree) ... ok testSubType (test_subclass.TestMyWin.testSubType) ... ok testIsThreadMain (test_threads.TestMPIThreads.testIsThreadMain) ... ok testIsThreadMainInThread (test_threads.TestMPIThreads.testIsThreadMainInThread) ... ok testSubType (test_subclass.TestMyPrequest2.testSubType) ... ok testSubType (test_subclass.TestMyRequest.testSubType) ... ok testSubType (test_subclass.TestMyRequest2.testSubType) ... ok testFree (test_subclass.TestMyWin.testFree) ... ok testSubType (test_subclass.TestMyWin.testSubType) ... ok testIsThreadMain (test_threads.TestMPIThreads.testIsThreadMain) ... ok testIsThreadMainInThread (test_threads.TestMPIThreads.testIsThreadMainInThread) ... testSubType (test_subclass.TestMyPrequest.testSubType) ... ok testStart (test_subclass.TestMyPrequest2.testStart) ... ok testSubType (test_subclass.TestMyPrequest2.testSubType) ... ok testSubType (test_subclass.TestMyRequest.testSubType) ... ok testSubType (test_subclass.TestMyRequest2.testSubType) ... ok testFree (test_subclass.TestMyWin.testFree) ... ok testSubType (test_subclass.TestMyWin.testSubType) ... ok testIsThreadMain (test_threads.TestMPIThreads.testIsThreadMain) ... ok testIsThreadMainInThread (test_threads.TestMPIThreads.testIsThreadMainInThread) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testThreadLevels (test_threads.TestMPIThreads.testThreadLevels) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testGetConfig (test_toplevel.TestConfig.testGetConfig) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetInclude (test_toplevel.TestConfig.testGetInclude) ... ok testProfile (test_toplevel.TestProfile.testProfile) ... ok testThreadLevels (test_threads.TestMPIThreads.testThreadLevels) ... ok testGetConfig (test_toplevel.TestConfig.testGetConfig) ... ok testGetInclude (test_toplevel.TestConfig.testGetInclude) ... ok testProfile (test_toplevel.TestProfile.testProfile) ... ok testThreadLevels (test_threads.TestMPIThreads.testThreadLevels) ... ok testGetConfig (test_toplevel.TestConfig.testGetConfig) ... ok testGetInclude (test_toplevel.TestConfig.testGetInclude) ... ok testProfile (test_toplevel.TestProfile.testProfile) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testThreadLevels (test_threads.TestMPIThreads.testThreadLevels) ... ok testGetConfig (test_toplevel.TestConfig.testGetConfig) ... ok testGetInclude (test_toplevel.TestConfig.testGetInclude) ... ok testProfile (test_toplevel.TestProfile.testProfile) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBadAttribute (test_toplevel.TestRC.testBadAttribute) ... ok testCallKwArgs (test_toplevel.TestRC.testCallKwArgs) ... ok testInitKwArgs (test_toplevel.TestRC.testInitKwArgs) ... ok testRepr (test_toplevel.TestRC.testRepr) ... ok ok testBadAttribute (test_toplevel.TestRC.testBadAttribute) ... ok testCallKwArgs (test_toplevel.TestRC.testCallKwArgs) ... ok testInitKwArgs (test_toplevel.TestRC.testInitKwArgs) ... ok testRepr (test_toplevel.TestRC.testRepr) ... ok ok testBadAttribute (test_toplevel.TestRC.testBadAttribute) ... 
ok testCallKwArgs (test_toplevel.TestRC.testCallKwArgs) ... ok testInitKwArgs (test_toplevel.TestRC.testInitKwArgs) ... ok testRepr (test_toplevel.TestRC.testRepr) ... ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAckFailed (test_ulfm.TestULFMInter.testAckFailed) ... ok testAgree (test_ulfm.TestULFMInter.testAgree) ... ok testBadAttribute (test_toplevel.TestRC.testBadAttribute) ... ok testCallKwArgs (test_toplevel.TestRC.testCallKwArgs) ... ok testInitKwArgs (test_toplevel.TestRC.testInitKwArgs) ... ok testRepr (test_toplevel.TestRC.testRepr) ... ok testAckFailed (test_ulfm.TestULFMInter.testAckFailed) ... ok testAgree (test_ulfm.TestULFMInter.testAgree) ... testAckFailed (test_ulfm.TestULFMInter.testAckFailed) ... ok testAgree (test_ulfm.TestULFMInter.testAgree) ... testAckFailed (test_ulfm.TestULFMInter.testAckFailed) ... ok testAgree (test_ulfm.TestULFMInter.testAgree) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetFailed (test_ulfm.TestULFMInter.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMInter.testIAgree) ... ok testIShrink (test_ulfm.TestULFMInter.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMInter.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMInter.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMInter.testShrink) ... ok testAckFailed (test_ulfm.TestULFMSelf.testAckFailed) ... ok testGetFailed (test_ulfm.TestULFMInter.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMInter.testIAgree) ... ok testIShrink (test_ulfm.TestULFMInter.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMInter.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMInter.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMInter.testShrink) ... ok testAckFailed (test_ulfm.TestULFMSelf.testAckFailed) ... 
ok testGetFailed (test_ulfm.TestULFMInter.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMInter.testIAgree) ... ok testIShrink (test_ulfm.TestULFMInter.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMInter.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMInter.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMInter.testShrink) ... ok ok testGetFailed (test_ulfm.TestULFMInter.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMInter.testIAgree) ... ok testIShrink (test_ulfm.TestULFMInter.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMInter.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMInter.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMInter.testShrink) ... ok testAckFailed (test_ulfm.TestULFMSelf.testAckFailed) ... ok ok testAckFailed (test_ulfm.TestULFMSelf.testAckFailed) ... testAgree (test_ulfm.TestULFMSelf.testAgree) ... testAgree (test_ulfm.TestULFMSelf.testAgree) ... ok testGetFailed (test_ulfm.TestULFMSelf.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMSelf.testIAgree) ... ok testIShrink (test_ulfm.TestULFMSelf.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMSelf.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMSelf.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMSelf.testShrink) ... ok testAckFailed (test_ulfm.TestULFMWorld.testAckFailed) ... ok testAgree (test_ulfm.TestULFMWorld.testAgree) ... ok testGetFailed (test_ulfm.TestULFMWorld.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMWorld.testIAgree) ... ok testIShrink (test_ulfm.TestULFMWorld.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMWorld.testIsRevoked) ... ok testGetFailed (test_ulfm.TestULFMSelf.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMSelf.testIAgree) ... ok testIShrink (test_ulfm.TestULFMSelf.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMSelf.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMSelf.testRevoke) ... 
skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMSelf.testShrink) ... ok testAckFailed (test_ulfm.TestULFMWorld.testAckFailed) ... ok testAgree (test_ulfm.TestULFMWorld.testAgree) ... ok testGetFailed (test_ulfm.TestULFMWorld.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMWorld.testIAgree) ... ok testIShrink (test_ulfm.TestULFMWorld.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMWorld.testIsRevoked) ... ok ok testAgree (test_ulfm.TestULFMSelf.testAgree) ... ok testGetFailed (test_ulfm.TestULFMSelf.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMSelf.testIAgree) ... ok testIShrink (test_ulfm.TestULFMSelf.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMSelf.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMSelf.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMSelf.testShrink) ... ok testAckFailed (test_ulfm.TestULFMWorld.testAckFailed) ... ok testAgree (test_ulfm.TestULFMWorld.testAgree) ... ok testGetFailed (test_ulfm.TestULFMWorld.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMWorld.testIAgree) ... ok testIShrink (test_ulfm.TestULFMWorld.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMWorld.testIsRevoked) ... ok ok testAgree (test_ulfm.TestULFMSelf.testAgree) ... ok testGetFailed (test_ulfm.TestULFMSelf.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMSelf.testIAgree) ... ok testIShrink (test_ulfm.TestULFMSelf.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMSelf.testIsRevoked) ... ok testRevoke (test_ulfm.TestULFMSelf.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMSelf.testShrink) ... ok testAckFailed (test_ulfm.TestULFMWorld.testAckFailed) ... ok testAgree (test_ulfm.TestULFMWorld.testAgree) ... ok testGetFailed (test_ulfm.TestULFMWorld.testGetFailed) ... ok testIAgree (test_ulfm.TestULFMWorld.testIAgree) ... ok testIShrink (test_ulfm.TestULFMWorld.testIShrink) ... ok testIsRevoked (test_ulfm.TestULFMWorld.testIsRevoked) ... 
ok ok testRevoke (test_ulfm.TestULFMWorld.testRevoke) ... testRevoke (test_ulfm.TestULFMWorld.testRevoke) ... testRevoke (test_ulfm.TestULFMWorld.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMWorld.testShrink) ... ok testAlignmentComplex (test_util_dtlib.TestUtilDTLib.testAlignmentComplex) ... ok testAlignmentPair (test_util_dtlib.TestUtilDTLib.testAlignmentPair) ... ok testAlignmentStruct (test_util_dtlib.TestUtilDTLib.testAlignmentStruct) ... ok testBasic (test_util_dtlib.TestUtilDTLib.testBasic) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMWorld.testShrink) ... ok testAlignmentComplex (test_util_dtlib.TestUtilDTLib.testAlignmentComplex) ... ok testAlignmentPair (test_util_dtlib.TestUtilDTLib.testAlignmentPair) ... ok testAlignmentStruct (test_util_dtlib.TestUtilDTLib.testAlignmentStruct) ... ok testBasic (test_util_dtlib.TestUtilDTLib.testBasic) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMWorld.testShrink) ... ok testAlignmentComplex (test_util_dtlib.TestUtilDTLib.testAlignmentComplex) ... ok testAlignmentPair (test_util_dtlib.TestUtilDTLib.testAlignmentPair) ... ok testAlignmentStruct (test_util_dtlib.TestUtilDTLib.testAlignmentStruct) ... ok testBasic (test_util_dtlib.TestUtilDTLib.testBasic) ... testRevoke (test_ulfm.TestULFMWorld.testRevoke) ... skipped 'mpi-comm_revoke' testShrink (test_ulfm.TestULFMWorld.testShrink) ... ok testAlignmentComplex (test_util_dtlib.TestUtilDTLib.testAlignmentComplex) ... ok testAlignmentPair (test_util_dtlib.TestUtilDTLib.testAlignmentPair) ... ok testAlignmentStruct (test_util_dtlib.TestUtilDTLib.testAlignmentStruct) ... ok testBasic (test_util_dtlib.TestUtilDTLib.testBasic) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testF77 (test_util_dtlib.TestUtilDTLib.testF77) ... ok testF90 (test_util_dtlib.TestUtilDTLib.testF90) ... 
ok testF90Complex (test_util_dtlib.TestUtilDTLib.testF90Complex) ... Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF77 (test_util_dtlib.TestUtilDTLib.testF77) ... ok testF90 (test_util_dtlib.TestUtilDTLib.testF90) ... ok testF77 (test_util_dtlib.TestUtilDTLib.testF77) ... ok testF90Integer (test_util_dtlib.TestUtilDTLib.testF90Integer) ... ok testF90Complex (test_util_dtlib.TestUtilDTLib.testF90Complex) ... ok testF90Integer (test_util_dtlib.TestUtilDTLib.testF90Integer) ... ok testF77 (test_util_dtlib.TestUtilDTLib.testF77) ... ok testF90 (test_util_dtlib.TestUtilDTLib.testF90) ... ok testF90Complex (test_util_dtlib.TestUtilDTLib.testF90Complex) ... ok testF90Integer (test_util_dtlib.TestUtilDTLib.testF90Integer) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90 (test_util_dtlib.TestUtilDTLib.testF90) ... ok testF90Complex (test_util_dtlib.TestUtilDTLib.testF90Complex) ... ok testF90Integer (test_util_dtlib.TestUtilDTLib.testF90Integer) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90Real (test_util_dtlib.TestUtilDTLib.testF90Real) ... ok testF90Real (test_util_dtlib.TestUtilDTLib.testF90Real) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFailures (test_util_dtlib.TestUtilDTLib.testFailures) ... ok testHIndexed (test_util_dtlib.TestUtilDTLib.testHIndexed) ... ok testFailures (test_util_dtlib.TestUtilDTLib.testFailures) ... ok testF90Real (test_util_dtlib.TestUtilDTLib.testF90Real) ... ok testFailures (test_util_dtlib.TestUtilDTLib.testFailures) ... ok testHIndexed (test_util_dtlib.TestUtilDTLib.testHIndexed) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHIndexed (test_util_dtlib.TestUtilDTLib.testHIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testF90Real (test_util_dtlib.TestUtilDTLib.testF90Real) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFailures (test_util_dtlib.TestUtilDTLib.testFailures) ... ok testHIndexed (test_util_dtlib.TestUtilDTLib.testHIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_util_dtlib.TestUtilDTLib.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_util_dtlib.TestUtilDTLib.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_util_dtlib.TestUtilDTLib.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testHVector (test_util_dtlib.TestUtilDTLib.testHVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_util_dtlib.TestUtilDTLib.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_util_dtlib.TestUtilDTLib.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_util_dtlib.TestUtilDTLib.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIndexed (test_util_dtlib.TestUtilDTLib.testIndexed) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testMissingNumPy (test_util_dtlib.TestUtilDTLib.testMissingNumPy) ... Sending upstream hdr.cmd = CMD_STDERR ok testPair (test_util_dtlib.TestUtilDTLib.testPair) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPairStruct (test_util_dtlib.TestUtilDTLib.testPairStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMissingNumPy (test_util_dtlib.TestUtilDTLib.testMissingNumPy) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPair (test_util_dtlib.TestUtilDTLib.testPair) ... ok testStruct1 (test_util_dtlib.TestUtilDTLib.testStruct1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPairStruct (test_util_dtlib.TestUtilDTLib.testPairStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMissingNumPy (test_util_dtlib.TestUtilDTLib.testMissingNumPy) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPair (test_util_dtlib.TestUtilDTLib.testPair) ... ok testPairStruct (test_util_dtlib.TestUtilDTLib.testPairStruct) ... ok testMissingNumPy (test_util_dtlib.TestUtilDTLib.testMissingNumPy) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPair (test_util_dtlib.TestUtilDTLib.testPair) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPairStruct (test_util_dtlib.TestUtilDTLib.testPairStruct) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct1 (test_util_dtlib.TestUtilDTLib.testStruct1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct1 (test_util_dtlib.TestUtilDTLib.testStruct1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct1 (test_util_dtlib.TestUtilDTLib.testStruct1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct2 (test_util_dtlib.TestUtilDTLib.testStruct2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct2 (test_util_dtlib.TestUtilDTLib.testStruct2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct2 (test_util_dtlib.TestUtilDTLib.testStruct2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct2 (test_util_dtlib.TestUtilDTLib.testStruct2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct3 (test_util_dtlib.TestUtilDTLib.testStruct3) ... 
ok testStruct4 (test_util_dtlib.TestUtilDTLib.testStruct4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct5 (test_util_dtlib.TestUtilDTLib.testStruct5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct3 (test_util_dtlib.TestUtilDTLib.testStruct3) ... ok testStruct4 (test_util_dtlib.TestUtilDTLib.testStruct4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct5 (test_util_dtlib.TestUtilDTLib.testStruct5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct3 (test_util_dtlib.TestUtilDTLib.testStruct3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct4 (test_util_dtlib.TestUtilDTLib.testStruct4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct3 (test_util_dtlib.TestUtilDTLib.testStruct3) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct4 (test_util_dtlib.TestUtilDTLib.testStruct4) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct5 (test_util_dtlib.TestUtilDTLib.testStruct5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testStruct5 (test_util_dtlib.TestUtilDTLib.testStruct5) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray1 (test_util_dtlib.TestUtilDTLib.testSubarray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray2 (test_util_dtlib.TestUtilDTLib.testSubarray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_util_dtlib.TestUtilDTLib.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray1 (test_util_dtlib.TestUtilDTLib.testSubarray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray2 (test_util_dtlib.TestUtilDTLib.testSubarray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherInter (test_util_pkl5.TestMPISelf.testAllgatherInter) ... 
skipped 'comm.size==1' testAllgatherIntra (test_util_pkl5.TestMPISelf.testAllgatherIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallInter (test_util_pkl5.TestMPISelf.testAlltoallInter) ... skipped 'comm.size==1' testAlltoallIntra (test_util_pkl5.TestMPISelf.testAlltoallIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBSendAndRecv (test_util_pkl5.TestMPISelf.testBSendAndRecv) ... ok testBcastInter (test_util_pkl5.TestMPISelf.testBcastInter) ... skipped 'comm.size==1' testBcastIntra (test_util_pkl5.TestMPISelf.testBcastIntra) ... ok testBigMPI (test_util_pkl5.TestMPISelf.testBigMPI) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR skipped 'comm.size==1' testGatherInter (test_util_pkl5.TestMPISelf.testGatherInter) ... skipped 'comm.size==1' testGatherIntra (test_util_pkl5.TestMPISelf.testGatherIntra) ... ok testGetStatusAll (test_util_pkl5.TestMPISelf.testGetStatusAll) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIBSendAndRecv (test_util_pkl5.TestMPISelf.testIBSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIMProbe (test_util_pkl5.TestMPISelf.testIMProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendAndRecv (test_util_pkl5.TestMPISelf.testISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendCancel (test_util_pkl5.TestMPISelf.testISSendCancel) ... ok testISendAndRecv (test_util_pkl5.TestMPISelf.testISendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIrecv (test_util_pkl5.TestMPISelf.testIrecv) ... ok testMProbe (test_util_pkl5.TestMPISelf.testMProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMessage (test_util_pkl5.TestMPISelf.testMessage) ... ok testMessageProbeIProbe (test_util_pkl5.TestMPISelf.testMessageProbeIProbe) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPingPong01 (test_util_pkl5.TestMPISelf.testPingPong01) ... ok testProbe (test_util_pkl5.TestMPISelf.testProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testRequest (test_util_pkl5.TestMPISelf.testRequest) ... ok testSSendAndMProbe (test_util_pkl5.TestMPISelf.testSSendAndMProbe) ... ok testSSendAndRecv (test_util_pkl5.TestMPISelf.testSSendAndRecv) ... ok testScatterInter (test_util_pkl5.TestMPISelf.testScatterInter) ... skipped 'comm.size==1' testScatterIntra (test_util_pkl5.TestMPISelf.testScatterIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSendAndRecv (test_util_pkl5.TestMPISelf.testSendAndRecv) ... ok testSendrecv (test_util_pkl5.TestMPISelf.testSendrecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testTestAll (test_util_pkl5.TestMPISelf.testTestAll) ... ok testWaitAll (test_util_pkl5.TestMPISelf.testWaitAll) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherInter (test_util_pkl5.TestMPIWorld.testAllgatherInter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_util_dtlib.TestUtilDTLib.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherInter (test_util_pkl5.TestMPISelf.testAllgatherInter) ... skipped 'comm.size==1' testAllgatherIntra (test_util_pkl5.TestMPISelf.testAllgatherIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallInter (test_util_pkl5.TestMPISelf.testAlltoallInter) ... skipped 'comm.size==1' testAlltoallIntra (test_util_pkl5.TestMPISelf.testAlltoallIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBSendAndRecv (test_util_pkl5.TestMPISelf.testBSendAndRecv) ... ok testBcastInter (test_util_pkl5.TestMPISelf.testBcastInter) ... skipped 'comm.size==1' testBcastIntra (test_util_pkl5.TestMPISelf.testBcastIntra) ... ok testBigMPI (test_util_pkl5.TestMPISelf.testBigMPI) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR skipped 'comm.size==1' testGatherInter (test_util_pkl5.TestMPISelf.testGatherInter) ... skipped 'comm.size==1' testGatherIntra (test_util_pkl5.TestMPISelf.testGatherIntra) ... ok testGetStatusAll (test_util_pkl5.TestMPISelf.testGetStatusAll) ... ok testIBSendAndRecv (test_util_pkl5.TestMPISelf.testIBSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIMProbe (test_util_pkl5.TestMPISelf.testIMProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendAndRecv (test_util_pkl5.TestMPISelf.testISSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testISSendCancel (test_util_pkl5.TestMPISelf.testISSendCancel) ... ok testISendAndRecv (test_util_pkl5.TestMPISelf.testISendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testIrecv (test_util_pkl5.TestMPISelf.testIrecv) ... ok testMProbe (test_util_pkl5.TestMPISelf.testMProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testMessage (test_util_pkl5.TestMPISelf.testMessage) ... ok testMessageProbeIProbe (test_util_pkl5.TestMPISelf.testMessageProbeIProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPingPong01 (test_util_pkl5.TestMPISelf.testPingPong01) ... ok testProbe (test_util_pkl5.TestMPISelf.testProbe) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testRequest (test_util_pkl5.TestMPISelf.testRequest) ... ok testSSendAndMProbe (test_util_pkl5.TestMPISelf.testSSendAndMProbe) ... ok testSSendAndRecv (test_util_pkl5.TestMPISelf.testSSendAndRecv) ... ok testScatterInter (test_util_pkl5.TestMPISelf.testScatterInter) ... skipped 'comm.size==1' testScatterIntra (test_util_pkl5.TestMPISelf.testScatterIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSendAndRecv (test_util_pkl5.TestMPISelf.testSendAndRecv) ... ok testSendrecv (test_util_pkl5.TestMPISelf.testSendrecv) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testTestAll (test_util_pkl5.TestMPISelf.testTestAll) ... ok testWaitAll (test_util_pkl5.TestMPISelf.testWaitAll) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherInter (test_util_pkl5.TestMPIWorld.testAllgatherInter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray1 (test_util_dtlib.TestUtilDTLib.testSubarray1) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testSubarray2 (test_util_dtlib.TestUtilDTLib.testSubarray2) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testVector (test_util_dtlib.TestUtilDTLib.testVector) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAllgatherInter (test_util_pkl5.TestMPISelf.testAllgatherInter) ... skipped 'comm.size==1' testAllgatherIntra (test_util_pkl5.TestMPISelf.testAllgatherIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallInter (test_util_pkl5.TestMPISelf.testAlltoallInter) ... skipped 'comm.size==1' testAlltoallIntra (test_util_pkl5.TestMPISelf.testAlltoallIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBSendAndRecv (test_util_pkl5.TestMPISelf.testBSendAndRecv) ... ok testBcastInter (test_util_pkl5.TestMPISelf.testBcastInter) ... skipped 'comm.size==1' testBcastIntra (test_util_pkl5.TestMPISelf.testBcastIntra) ... ok testBigMPI (test_util_pkl5.TestMPISelf.testBigMPI) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR skipped 'comm.size==1' testGatherInter (test_util_pkl5.TestMPISelf.testGatherInter) ... skipped 'comm.size==1' testGatherIntra (test_util_pkl5.TestMPISelf.testGatherIntra) ... ok testGetStatusAll (test_util_pkl5.TestMPISelf.testGetStatusAll) ... ok testIBSendAndRecv (test_util_pkl5.TestMPISelf.testIBSendAndRecv) ... 
ok
testIMProbe (test_util_pkl5.TestMPISelf.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestMPISelf.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPISelf.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestMPISelf.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestMPISelf.testIrecv) ... ok
testMProbe (test_util_pkl5.TestMPISelf.testMProbe) ... ok
testMessage (test_util_pkl5.TestMPISelf.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPISelf.testMessageProbeIProbe) ... ok
testPingPong01 (test_util_pkl5.TestMPISelf.testPingPong01) ... ok
testProbe (test_util_pkl5.TestMPISelf.testProbe) ... ok
testRequest (test_util_pkl5.TestMPISelf.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPISelf.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestMPISelf.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestMPISelf.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestMPISelf.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestMPISelf.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestMPISelf.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestMPISelf.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestMPISelf.testWaitAll) ... ok
testAllgatherInter (test_util_pkl5.TestMPIWorld.testAllgatherInter) ...
ok
testSubarray1 (test_util_dtlib.TestUtilDTLib.testSubarray1) ... ok
testSubarray2 (test_util_dtlib.TestUtilDTLib.testSubarray2) ... ok
testVector (test_util_dtlib.TestUtilDTLib.testVector) ... ok
testAllgatherInter (test_util_pkl5.TestMPISelf.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestMPISelf.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestMPISelf.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestMPISelf.testAlltoallIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestMPISelf.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestMPISelf.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestMPISelf.testBcastIntra) ... ok
testBigMPI (test_util_pkl5.TestMPISelf.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestMPISelf.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestMPISelf.testGatherIntra) ... ok
testGetStatusAll (test_util_pkl5.TestMPISelf.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestMPISelf.testIBSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestMPISelf.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestMPISelf.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPISelf.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestMPISelf.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestMPISelf.testIrecv) ...
ok
testMProbe (test_util_pkl5.TestMPISelf.testMProbe) ... ok
testMessage (test_util_pkl5.TestMPISelf.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPISelf.testMessageProbeIProbe) ... ok
testPingPong01 (test_util_pkl5.TestMPISelf.testPingPong01) ... ok
testProbe (test_util_pkl5.TestMPISelf.testProbe) ... ok
testRequest (test_util_pkl5.TestMPISelf.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPISelf.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestMPISelf.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestMPISelf.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestMPISelf.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestMPISelf.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestMPISelf.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestMPISelf.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestMPISelf.testWaitAll) ... ok
testAllgatherInter (test_util_pkl5.TestMPIWorld.testAllgatherInter) ... ok
testAllgatherIntra (test_util_pkl5.TestMPIWorld.testAllgatherIntra) ... ok
testAllgatherIntra (test_util_pkl5.TestMPIWorld.testAllgatherIntra) ... ok
testAllgatherIntra (test_util_pkl5.TestMPIWorld.testAllgatherIntra) ...
ok
testAllgatherIntra (test_util_pkl5.TestMPIWorld.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestMPIWorld.testAlltoallInter) ... ok
testAlltoallInter (test_util_pkl5.TestMPIWorld.testAlltoallInter) ... ok
testAlltoallInter (test_util_pkl5.TestMPIWorld.testAlltoallInter) ... ok
testAlltoallInter (test_util_pkl5.TestMPIWorld.testAlltoallInter) ... ok
testAlltoallIntra (test_util_pkl5.TestMPIWorld.testAlltoallIntra) ... ok
testAlltoallIntra (test_util_pkl5.TestMPIWorld.testAlltoallIntra) ... ok
testAlltoallIntra (test_util_pkl5.TestMPIWorld.testAlltoallIntra) ... ok
testAlltoallIntra (test_util_pkl5.TestMPIWorld.testAlltoallIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestMPIWorld.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestMPIWorld.testBcastInter) ... ok
testBcastIntra (test_util_pkl5.TestMPIWorld.testBcastIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestMPIWorld.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestMPIWorld.testBcastInter) ... ok
testBcastIntra (test_util_pkl5.TestMPIWorld.testBcastIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestMPIWorld.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestMPIWorld.testBcastInter) ... ok
testBcastIntra (test_util_pkl5.TestMPIWorld.testBcastIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestMPIWorld.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestMPIWorld.testBcastInter) ... ok
testBcastIntra (test_util_pkl5.TestMPIWorld.testBcastIntra) ...
ok
testBigMPI (test_util_pkl5.TestMPIWorld.testBigMPI) ... ok
testBigMPI (test_util_pkl5.TestMPIWorld.testBigMPI) ... ok
testBigMPI (test_util_pkl5.TestMPIWorld.testBigMPI) ... ok
testBigMPI (test_util_pkl5.TestMPIWorld.testBigMPI) ... ok
testGatherInter (test_util_pkl5.TestMPIWorld.testGatherInter) ... ok
testGatherInter (test_util_pkl5.TestMPIWorld.testGatherInter) ... ok
testGatherInter (test_util_pkl5.TestMPIWorld.testGatherInter) ... ok
testGatherInter (test_util_pkl5.TestMPIWorld.testGatherInter) ... ok
testGatherIntra (test_util_pkl5.TestMPIWorld.testGatherIntra) ... ok
testGatherIntra (test_util_pkl5.TestMPIWorld.testGatherIntra) ... ok
testGatherIntra (test_util_pkl5.TestMPIWorld.testGatherIntra) ... ok
testGatherIntra (test_util_pkl5.TestMPIWorld.testGatherIntra) ... ok
testGetStatusAll (test_util_pkl5.TestMPIWorld.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestMPIWorld.testIBSendAndRecv) ... ok
testGetStatusAll (test_util_pkl5.TestMPIWorld.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestMPIWorld.testIBSendAndRecv) ...
ok
testGetStatusAll (test_util_pkl5.TestMPIWorld.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestMPIWorld.testIBSendAndRecv) ... ok
testGetStatusAll (test_util_pkl5.TestMPIWorld.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestMPIWorld.testIBSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestMPIWorld.testIMProbe) ... ok
testIMProbe (test_util_pkl5.TestMPIWorld.testIMProbe) ... ok
testIMProbe (test_util_pkl5.TestMPIWorld.testIMProbe) ... ok
testIMProbe (test_util_pkl5.TestMPIWorld.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestMPIWorld.testISSendAndRecv) ... ok
testISSendAndRecv (test_util_pkl5.TestMPIWorld.testISSendAndRecv) ... ok
testISSendAndRecv (test_util_pkl5.TestMPIWorld.testISSendAndRecv) ... ok
testISSendAndRecv (test_util_pkl5.TestMPIWorld.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPIWorld.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestMPIWorld.testISendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPIWorld.testISSendCancel) ...
ok
testISendAndRecv (test_util_pkl5.TestMPIWorld.testISendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPIWorld.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestMPIWorld.testISendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestMPIWorld.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestMPIWorld.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestMPIWorld.testIrecv) ... ok
testMProbe (test_util_pkl5.TestMPIWorld.testMProbe) ... ok
testIrecv (test_util_pkl5.TestMPIWorld.testIrecv) ... ok
testMProbe (test_util_pkl5.TestMPIWorld.testMProbe) ... ok
testIrecv (test_util_pkl5.TestMPIWorld.testIrecv) ... ok
testMProbe (test_util_pkl5.TestMPIWorld.testMProbe) ... ok
testIrecv (test_util_pkl5.TestMPIWorld.testIrecv) ... ok
testMProbe (test_util_pkl5.TestMPIWorld.testMProbe) ... ok
testMessage (test_util_pkl5.TestMPIWorld.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPIWorld.testMessageProbeIProbe) ... ok
testMessage (test_util_pkl5.TestMPIWorld.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPIWorld.testMessageProbeIProbe) ... ok
testMessage (test_util_pkl5.TestMPIWorld.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPIWorld.testMessageProbeIProbe) ... ok
testMessage (test_util_pkl5.TestMPIWorld.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestMPIWorld.testMessageProbeIProbe) ... ok
testPingPong01 (test_util_pkl5.TestMPIWorld.testPingPong01) ... ok
testPingPong01 (test_util_pkl5.TestMPIWorld.testPingPong01) ...
ok
testPingPong01 (test_util_pkl5.TestMPIWorld.testPingPong01) ... ok
testProbe (test_util_pkl5.TestMPIWorld.testProbe) ... ok
testPingPong01 (test_util_pkl5.TestMPIWorld.testPingPong01) ... ok
testProbe (test_util_pkl5.TestMPIWorld.testProbe) ... ok
testProbe (test_util_pkl5.TestMPIWorld.testProbe) ... ok
testProbe (test_util_pkl5.TestMPIWorld.testProbe) ... ok
testRequest (test_util_pkl5.TestMPIWorld.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPIWorld.testSSendAndMProbe) ... ok
testRequest (test_util_pkl5.TestMPIWorld.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPIWorld.testSSendAndMProbe) ... ok
testRequest (test_util_pkl5.TestMPIWorld.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPIWorld.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestMPIWorld.testSSendAndRecv) ... ok
testRequest (test_util_pkl5.TestMPIWorld.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestMPIWorld.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestMPIWorld.testSSendAndRecv) ... ok
testSSendAndRecv (test_util_pkl5.TestMPIWorld.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestMPIWorld.testScatterInter) ... ok
testSSendAndRecv (test_util_pkl5.TestMPIWorld.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestMPIWorld.testScatterInter) ... ok
testScatterInter (test_util_pkl5.TestMPIWorld.testScatterInter) ...
ok
testScatterInter (test_util_pkl5.TestMPIWorld.testScatterInter) ... ok
testScatterIntra (test_util_pkl5.TestMPIWorld.testScatterIntra) ... ok
testScatterIntra (test_util_pkl5.TestMPIWorld.testScatterIntra) ... ok
testScatterIntra (test_util_pkl5.TestMPIWorld.testScatterIntra) ... ok
testScatterIntra (test_util_pkl5.TestMPIWorld.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestMPIWorld.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestMPIWorld.testSendrecv) ... ok
testSendAndRecv (test_util_pkl5.TestMPIWorld.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestMPIWorld.testSendrecv) ... ok
testSendAndRecv (test_util_pkl5.TestMPIWorld.testSendAndRecv) ... ok
testSendAndRecv (test_util_pkl5.TestMPIWorld.testSendAndRecv) ... ok ok
testTestAll (test_util_pkl5.TestMPIWorld.testTestAll) ... ok
testTestAll (test_util_pkl5.TestMPIWorld.testTestAll) ...
testSendrecv (test_util_pkl5.TestMPIWorld.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestMPIWorld.testTestAll) ... ok
testSendrecv (test_util_pkl5.TestMPIWorld.testSendrecv) ... ok
testTestAll (test_util_pkl5.TestMPIWorld.testTestAll) ...
ok
testWaitAll (test_util_pkl5.TestMPIWorld.testWaitAll) ... ok
testWaitAll (test_util_pkl5.TestMPIWorld.testWaitAll) ... ok
testWaitAll (test_util_pkl5.TestMPIWorld.testWaitAll) ... ok
testWaitAll (test_util_pkl5.TestMPIWorld.testWaitAll) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5Self.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestPKL5Self.testAllgatherIntra) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5Self.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestPKL5Self.testAllgatherIntra) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5Self.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestPKL5Self.testAllgatherIntra) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5Self.testAllgatherInter) ... skipped 'comm.size==1'
testAllgatherIntra (test_util_pkl5.TestPKL5Self.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestPKL5Self.testAlltoallInter) ... ok
testAlltoallInter (test_util_pkl5.TestPKL5Self.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestPKL5Self.testAlltoallIntra) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestPKL5Self.testAlltoallIntra) ...
ok
testAlltoallInter (test_util_pkl5.TestPKL5Self.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestPKL5Self.testAlltoallIntra) ... ok
testAlltoallInter (test_util_pkl5.TestPKL5Self.testAlltoallInter) ... skipped 'comm.size==1'
testAlltoallIntra (test_util_pkl5.TestPKL5Self.testAlltoallIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestPKL5Self.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestPKL5Self.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestPKL5Self.testBcastIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestPKL5Self.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestPKL5Self.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestPKL5Self.testBcastIntra) ... ok
testBSendAndRecv (test_util_pkl5.TestPKL5Self.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestPKL5Self.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestPKL5Self.testBcastIntra) ... ok
testBigMPI (test_util_pkl5.TestPKL5Self.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestPKL5Self.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestPKL5Self.testGatherIntra) ... ok
testBigMPI (test_util_pkl5.TestPKL5Self.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestPKL5Self.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestPKL5Self.testGatherIntra) ...
ok
testBSendAndRecv (test_util_pkl5.TestPKL5Self.testBSendAndRecv) ... ok
testBcastInter (test_util_pkl5.TestPKL5Self.testBcastInter) ... skipped 'comm.size==1'
testBcastIntra (test_util_pkl5.TestPKL5Self.testBcastIntra) ... ok
testBigMPI (test_util_pkl5.TestPKL5Self.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestPKL5Self.testGatherInter) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5Self.testGetStatusAll) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5Self.testGetStatusAll) ... ok
testBigMPI (test_util_pkl5.TestPKL5Self.testBigMPI) ... skipped 'comm.size==1'
testGatherInter (test_util_pkl5.TestPKL5Self.testGatherInter) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestPKL5Self.testGatherIntra) ... skipped 'comm.size==1'
testGatherIntra (test_util_pkl5.TestPKL5Self.testGatherIntra) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5Self.testIBSendAndRecv) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5Self.testIBSendAndRecv) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5Self.testGetStatusAll) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5Self.testGetStatusAll) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5Self.testIBSendAndRecv) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5Self.testIBSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestPKL5Self.testIMProbe) ...
ok
testIMProbe (test_util_pkl5.TestPKL5Self.testIMProbe) ... ok
testIMProbe (test_util_pkl5.TestPKL5Self.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5Self.testISSendAndRecv) ... ok
testIMProbe (test_util_pkl5.TestPKL5Self.testIMProbe) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5Self.testISSendAndRecv) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5Self.testISSendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestPKL5Self.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5Self.testISendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestPKL5Self.testISSendCancel) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5Self.testISSendAndRecv) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5Self.testISendAndRecv) ... ok
testISSendCancel (test_util_pkl5.TestPKL5Self.testISSendCancel) ... ok
testIrecv (test_util_pkl5.TestPKL5Self.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5Self.testMProbe) ... ok
testISSendCancel (test_util_pkl5.TestPKL5Self.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5Self.testISendAndRecv) ...
ok
testIrecv (test_util_pkl5.TestPKL5Self.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5Self.testMProbe) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5Self.testISendAndRecv) ... ok
testIrecv (test_util_pkl5.TestPKL5Self.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5Self.testMProbe) ... ok
testIrecv (test_util_pkl5.TestPKL5Self.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5Self.testMProbe) ... ok
testMessage (test_util_pkl5.TestPKL5Self.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5Self.testMessageProbeIProbe) ... ok
testMessage (test_util_pkl5.TestPKL5Self.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5Self.testMessageProbeIProbe) ... ok
testPickle5 (test_util_pkl5.TestPKL5Self.testPickle5) ... ok
testMessage (test_util_pkl5.TestPKL5Self.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5Self.testMessageProbeIProbe) ... ok
testPickle5 (test_util_pkl5.TestPKL5Self.testPickle5) ... ok
testMessage (test_util_pkl5.TestPKL5Self.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5Self.testMessageProbeIProbe) ... ok
testPickle5 (test_util_pkl5.TestPKL5Self.testPickle5) ... ok
testPickle5 (test_util_pkl5.TestPKL5Self.testPickle5) ...
ok
testPingPong01 (test_util_pkl5.TestPKL5Self.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5Self.testProbe) ... ok
testRequest (test_util_pkl5.TestPKL5Self.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5Self.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5Self.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestPKL5Self.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestPKL5Self.testScatterIntra) ... ok
testPingPong01 (test_util_pkl5.TestPKL5Self.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5Self.testProbe) ... ok
testSendAndRecv (test_util_pkl5.TestPKL5Self.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5Self.testSendrecv) ... ok
testRequest (test_util_pkl5.TestPKL5Self.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5Self.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5Self.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestPKL5Self.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestPKL5Self.testScatterIntra) ... ok
testPingPong01 (test_util_pkl5.TestPKL5Self.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5Self.testProbe) ... ok
testTestAll (test_util_pkl5.TestPKL5Self.testTestAll) ... ok
testRequest (test_util_pkl5.TestPKL5Self.testRequest) ...
ok ok
testWaitAll (test_util_pkl5.TestPKL5Self.testWaitAll) ...
testSSendAndMProbe (test_util_pkl5.TestPKL5Self.testSSendAndMProbe) ... ok
testSendAndRecv (test_util_pkl5.TestPKL5Self.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5Self.testSendrecv) ... ok
testPingPong01 (test_util_pkl5.TestPKL5Self.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5Self.testProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5Self.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestPKL5Self.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestPKL5Self.testScatterIntra) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5World.testAllgatherInter) ... ok
testRequest (test_util_pkl5.TestPKL5Self.testRequest) ... ok
testTestAll (test_util_pkl5.TestPKL5Self.testTestAll) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5Self.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5Self.testSSendAndRecv) ... ok
testScatterInter (test_util_pkl5.TestPKL5Self.testScatterInter) ... skipped 'comm.size==1'
testScatterIntra (test_util_pkl5.TestPKL5Self.testScatterIntra) ... ok
testSendAndRecv (test_util_pkl5.TestPKL5Self.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5Self.testSendrecv) ... ok
testWaitAll (test_util_pkl5.TestPKL5Self.testWaitAll) ...
ok
testSendAndRecv (test_util_pkl5.TestPKL5Self.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5Self.testSendrecv) ... ok ok
testAllgatherInter (test_util_pkl5.TestPKL5World.testAllgatherInter) ...
testTestAll (test_util_pkl5.TestPKL5Self.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5Self.testWaitAll) ... ok
testTestAll (test_util_pkl5.TestPKL5Self.testTestAll) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5World.testAllgatherInter) ... ok
testWaitAll (test_util_pkl5.TestPKL5Self.testWaitAll) ... ok
testAllgatherInter (test_util_pkl5.TestPKL5World.testAllgatherInter) ... ok
testAllgatherIntra (test_util_pkl5.TestPKL5World.testAllgatherIntra) ... ok
testAllgatherIntra (test_util_pkl5.TestPKL5World.testAllgatherIntra) ... ok
testAllgatherIntra (test_util_pkl5.TestPKL5World.testAllgatherIntra) ... ok
testAllgatherIntra (test_util_pkl5.TestPKL5World.testAllgatherIntra) ... ok
testAlltoallInter (test_util_pkl5.TestPKL5World.testAlltoallInter) ...
ok testAlltoallInter (test_util_pkl5.TestPKL5World.testAlltoallInter) ... ok testAlltoallInter (test_util_pkl5.TestPKL5World.testAlltoallInter) ... ok testAlltoallInter (test_util_pkl5.TestPKL5World.testAlltoallInter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAlltoallIntra (test_util_pkl5.TestPKL5World.testAlltoallIntra) ... ok testAlltoallIntra (test_util_pkl5.TestPKL5World.testAlltoallIntra) ... ok testAlltoallIntra (test_util_pkl5.TestPKL5World.testAlltoallIntra) ... ok testAlltoallIntra (test_util_pkl5.TestPKL5World.testAlltoallIntra) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBSendAndRecv (test_util_pkl5.TestPKL5World.testBSendAndRecv) ... ok testBSendAndRecv (test_util_pkl5.TestPKL5World.testBSendAndRecv) ... ok testBSendAndRecv (test_util_pkl5.TestPKL5World.testBSendAndRecv) ... ok testBSendAndRecv (test_util_pkl5.TestPKL5World.testBSendAndRecv) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastInter (test_util_pkl5.TestPKL5World.testBcastInter) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBcastIntra (test_util_pkl5.TestPKL5World.testBcastIntra) ... ok testBcastInter (test_util_pkl5.TestPKL5World.testBcastInter) ... ok testBcastIntra (test_util_pkl5.TestPKL5World.testBcastIntra) ... ok testBcastInter (test_util_pkl5.TestPKL5World.testBcastInter) ... ok testBcastIntra (test_util_pkl5.TestPKL5World.testBcastIntra) ... 
ok
testBcastInter (test_util_pkl5.TestPKL5World.testBcastInter) ... ok
testBcastIntra (test_util_pkl5.TestPKL5World.testBcastIntra) ...
ok
testBigMPI (test_util_pkl5.TestPKL5World.testBigMPI) ... ok
testBigMPI (test_util_pkl5.TestPKL5World.testBigMPI) ... ok
testBigMPI (test_util_pkl5.TestPKL5World.testBigMPI) ... ok
testBigMPI (test_util_pkl5.TestPKL5World.testBigMPI) ...
ok
testGatherInter (test_util_pkl5.TestPKL5World.testGatherInter) ... ok
testGatherInter (test_util_pkl5.TestPKL5World.testGatherInter) ...
ok
testGatherInter (test_util_pkl5.TestPKL5World.testGatherInter) ... ok
testGatherInter (test_util_pkl5.TestPKL5World.testGatherInter) ...
ok
testGatherIntra (test_util_pkl5.TestPKL5World.testGatherIntra) ... ok
testGatherIntra (test_util_pkl5.TestPKL5World.testGatherIntra) ... ok
testGatherIntra (test_util_pkl5.TestPKL5World.testGatherIntra) ... ok
testGatherIntra (test_util_pkl5.TestPKL5World.testGatherIntra) ...
ok
testGetStatusAll (test_util_pkl5.TestPKL5World.testGetStatusAll) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5World.testGetStatusAll) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5World.testGetStatusAll) ... ok
testGetStatusAll (test_util_pkl5.TestPKL5World.testGetStatusAll) ...
ok
testIBSendAndRecv (test_util_pkl5.TestPKL5World.testIBSendAndRecv) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5World.testIBSendAndRecv) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5World.testIBSendAndRecv) ... ok
testIBSendAndRecv (test_util_pkl5.TestPKL5World.testIBSendAndRecv) ...
ok
testIMProbe (test_util_pkl5.TestPKL5World.testIMProbe) ... ok
testIMProbe (test_util_pkl5.TestPKL5World.testIMProbe) ... ok
testIMProbe (test_util_pkl5.TestPKL5World.testIMProbe) ... ok
testIMProbe (test_util_pkl5.TestPKL5World.testIMProbe) ...
ok
testISSendAndRecv (test_util_pkl5.TestPKL5World.testISSendAndRecv) ...
ok
testISSendAndRecv (test_util_pkl5.TestPKL5World.testISSendAndRecv) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5World.testISSendAndRecv) ... ok
testISSendAndRecv (test_util_pkl5.TestPKL5World.testISSendAndRecv) ...
ok
testISSendCancel (test_util_pkl5.TestPKL5World.testISSendCancel) ...
ok
testISSendCancel (test_util_pkl5.TestPKL5World.testISSendCancel) ... ok
testISSendCancel (test_util_pkl5.TestPKL5World.testISSendCancel) ... ok
testISSendCancel (test_util_pkl5.TestPKL5World.testISSendCancel) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5World.testISendAndRecv) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5World.testISendAndRecv) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5World.testISendAndRecv) ... ok
testISendAndRecv (test_util_pkl5.TestPKL5World.testISendAndRecv) ...
ok
testIrecv (test_util_pkl5.TestPKL5World.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5World.testMProbe) ... ok
testIrecv (test_util_pkl5.TestPKL5World.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5World.testMProbe) ... ok
testIrecv (test_util_pkl5.TestPKL5World.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5World.testMProbe) ... ok
testIrecv (test_util_pkl5.TestPKL5World.testIrecv) ... ok
testMProbe (test_util_pkl5.TestPKL5World.testMProbe) ...
ok
testMessage (test_util_pkl5.TestPKL5World.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5World.testMessageProbeIProbe) ...
ok
testMessage (test_util_pkl5.TestPKL5World.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5World.testMessageProbeIProbe) ... ok
testMessage (test_util_pkl5.TestPKL5World.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5World.testMessageProbeIProbe) ... ok
testMessage (test_util_pkl5.TestPKL5World.testMessage) ... ok
testMessageProbeIProbe (test_util_pkl5.TestPKL5World.testMessageProbeIProbe) ...
ok
testPickle5 (test_util_pkl5.TestPKL5World.testPickle5) ...
ok
testPickle5 (test_util_pkl5.TestPKL5World.testPickle5) ... ok
testPickle5 (test_util_pkl5.TestPKL5World.testPickle5) ... ok
testPickle5 (test_util_pkl5.TestPKL5World.testPickle5) ...
ok
testPingPong01 (test_util_pkl5.TestPKL5World.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5World.testProbe) ... ok
testPingPong01 (test_util_pkl5.TestPKL5World.testPingPong01) ... ok
testPingPong01 (test_util_pkl5.TestPKL5World.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5World.testProbe) ... ok
testPingPong01 (test_util_pkl5.TestPKL5World.testPingPong01) ... ok
testProbe (test_util_pkl5.TestPKL5World.testProbe) ... ok
testProbe (test_util_pkl5.TestPKL5World.testProbe) ...
ok
testRequest (test_util_pkl5.TestPKL5World.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5World.testSSendAndMProbe) ... ok
testRequest (test_util_pkl5.TestPKL5World.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5World.testSSendAndMProbe) ... ok
testRequest (test_util_pkl5.TestPKL5World.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5World.testSSendAndMProbe) ... ok
ok
testRequest (test_util_pkl5.TestPKL5World.testRequest) ... ok
testSSendAndMProbe (test_util_pkl5.TestPKL5World.testSSendAndMProbe) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5World.testSSendAndRecv) ...
testSSendAndRecv (test_util_pkl5.TestPKL5World.testSSendAndRecv) ...
ok
testSSendAndRecv (test_util_pkl5.TestPKL5World.testSSendAndRecv) ... ok
testSSendAndRecv (test_util_pkl5.TestPKL5World.testSSendAndRecv) ...
ok
ok
testScatterInter (test_util_pkl5.TestPKL5World.testScatterInter) ... ok
testScatterInter (test_util_pkl5.TestPKL5World.testScatterInter) ... ok
testScatterInter (test_util_pkl5.TestPKL5World.testScatterInter) ...
testScatterInter (test_util_pkl5.TestPKL5World.testScatterInter) ...
ok
ok
testScatterIntra (test_util_pkl5.TestPKL5World.testScatterIntra) ... ok
testScatterIntra (test_util_pkl5.TestPKL5World.testScatterIntra) ... ok
testScatterIntra (test_util_pkl5.TestPKL5World.testScatterIntra) ...
testScatterIntra (test_util_pkl5.TestPKL5World.testScatterIntra) ...
ok
testSendAndRecv (test_util_pkl5.TestPKL5World.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5World.testSendrecv) ...
ok
testSendAndRecv (test_util_pkl5.TestPKL5World.testSendAndRecv) ... ok
testSendAndRecv (test_util_pkl5.TestPKL5World.testSendAndRecv) ... ok
testSendAndRecv (test_util_pkl5.TestPKL5World.testSendAndRecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5World.testSendrecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5World.testSendrecv) ... ok
testSendrecv (test_util_pkl5.TestPKL5World.testSendrecv) ...
ok
testTestAll (test_util_pkl5.TestPKL5World.testTestAll) ...
ok
testTestAll (test_util_pkl5.TestPKL5World.testTestAll) ... ok
testTestAll (test_util_pkl5.TestPKL5World.testTestAll) ... ok
testTestAll (test_util_pkl5.TestPKL5World.testTestAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5World.testWaitAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5World.testWaitAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5World.testWaitAll) ... ok
testWaitAll (test_util_pkl5.TestPKL5World.testWaitAll) ...
ok
test_apply (test_util_pool.TestProcessPool.test_apply) ... skipped 'mpi-world-size>1'
test_apply_async (test_util_pool.TestProcessPool.test_apply_async) ... skipped 'mpi-world-size>1'
test_apply_async_timeout (test_util_pool.TestProcessPool.test_apply_async_timeout) ... skipped 'mpi-world-size>1'
test_arg_initializer (test_util_pool.TestProcessPool.test_arg_initializer) ... skipped 'mpi-world-size>1'
test_arg_processes (test_util_pool.TestProcessPool.test_arg_processes) ... skipped 'mpi-world-size>1'
test_async_error_callback (test_util_pool.TestProcessPool.test_async_error_callback) ... skipped 'mpi-world-size>1'
test_empty_iterable (test_util_pool.TestProcessPool.test_empty_iterable) ... skipped 'mpi-world-size>1'
test_enter_exit (test_util_pool.TestProcessPool.test_enter_exit) ... skipped 'mpi-world-size>1'
test_imap (test_util_pool.TestProcessPool.test_imap) ... skipped 'mpi-world-size>1'
test_imap_unordered (test_util_pool.TestProcessPool.test_imap_unordered) ... skipped 'mpi-world-size>1'
test_istarmap (test_util_pool.TestProcessPool.test_istarmap) ...
skipped 'mpi-world-size>1'
test_istarmap_unordered (test_util_pool.TestProcessPool.test_istarmap_unordered) ... skipped 'mpi-world-size>1'
test_map (test_util_pool.TestProcessPool.test_map) ... skipped 'mpi-world-size>1'
test_map_async (test_util_pool.TestProcessPool.test_map_async) ... skipped 'mpi-world-size>1'
test_map_async_callbacks (test_util_pool.TestProcessPool.test_map_async_callbacks) ... skipped 'mpi-world-size>1'
test_pool_worker_lifetime_early_close (test_util_pool.TestProcessPool.test_pool_worker_lifetime_early_close) ... skipped 'mpi-world-size>1'
test_starmap (test_util_pool.TestProcessPool.test_starmap) ... skipped 'mpi-world-size>1'
test_starmap_async (test_util_pool.TestProcessPool.test_starmap_async) ... skipped 'mpi-world-size>1'
test_terminate (test_util_pool.TestProcessPool.test_terminate) ... skipped 'mpi-world-size>1'
test_unsupported_args (test_util_pool.TestProcessPool.test_unsupported_args) ... skipped 'mpi-world-size>1'
test_apply (test_util_pool.TestThreadPool.test_apply) ... ok
test_apply (test_util_pool.TestProcessPool.test_apply) ... skipped 'mpi-world-size>1'
test_apply_async (test_util_pool.TestProcessPool.test_apply_async) ... skipped 'mpi-world-size>1'
test_apply_async_timeout (test_util_pool.TestProcessPool.test_apply_async_timeout) ... skipped 'mpi-world-size>1'
test_arg_initializer (test_util_pool.TestProcessPool.test_arg_initializer) ... skipped 'mpi-world-size>1'
test_arg_processes (test_util_pool.TestProcessPool.test_arg_processes) ... skipped 'mpi-world-size>1'
test_async_error_callback (test_util_pool.TestProcessPool.test_async_error_callback) ... skipped 'mpi-world-size>1'
test_empty_iterable (test_util_pool.TestProcessPool.test_empty_iterable) ... skipped 'mpi-world-size>1'
test_enter_exit (test_util_pool.TestProcessPool.test_enter_exit) ... skipped 'mpi-world-size>1'
test_imap (test_util_pool.TestProcessPool.test_imap) ...
skipped 'mpi-world-size>1'
test_imap_unordered (test_util_pool.TestProcessPool.test_imap_unordered) ... skipped 'mpi-world-size>1'
test_istarmap (test_util_pool.TestProcessPool.test_istarmap) ... skipped 'mpi-world-size>1'
test_istarmap_unordered (test_util_pool.TestProcessPool.test_istarmap_unordered) ... skipped 'mpi-world-size>1'
test_map (test_util_pool.TestProcessPool.test_map) ... skipped 'mpi-world-size>1'
test_map_async (test_util_pool.TestProcessPool.test_map_async) ... skipped 'mpi-world-size>1'
test_map_async_callbacks (test_util_pool.TestProcessPool.test_map_async_callbacks) ... skipped 'mpi-world-size>1'
test_pool_worker_lifetime_early_close (test_util_pool.TestProcessPool.test_pool_worker_lifetime_early_close) ... skipped 'mpi-world-size>1'
test_starmap (test_util_pool.TestProcessPool.test_starmap) ... skipped 'mpi-world-size>1'
test_starmap_async (test_util_pool.TestProcessPool.test_starmap_async) ... skipped 'mpi-world-size>1'
test_terminate (test_util_pool.TestProcessPool.test_terminate) ... skipped 'mpi-world-size>1'
test_unsupported_args (test_util_pool.TestProcessPool.test_unsupported_args) ... skipped 'mpi-world-size>1'
test_apply (test_util_pool.TestThreadPool.test_apply) ... ok
test_apply (test_util_pool.TestProcessPool.test_apply) ... skipped 'mpi-world-size>1'
test_apply_async (test_util_pool.TestProcessPool.test_apply_async) ... skipped 'mpi-world-size>1'
test_apply_async_timeout (test_util_pool.TestProcessPool.test_apply_async_timeout) ... skipped 'mpi-world-size>1'
test_arg_initializer (test_util_pool.TestProcessPool.test_arg_initializer) ... skipped 'mpi-world-size>1'
test_arg_processes (test_util_pool.TestProcessPool.test_arg_processes) ... skipped 'mpi-world-size>1'
test_async_error_callback (test_util_pool.TestProcessPool.test_async_error_callback) ... skipped 'mpi-world-size>1'
test_empty_iterable (test_util_pool.TestProcessPool.test_empty_iterable) ... skipped 'mpi-world-size>1'
test_enter_exit (test_util_pool.TestProcessPool.test_enter_exit) ... skipped 'mpi-world-size>1'
test_imap (test_util_pool.TestProcessPool.test_imap) ... skipped 'mpi-world-size>1'
test_imap_unordered (test_util_pool.TestProcessPool.test_imap_unordered) ... skipped 'mpi-world-size>1'
test_istarmap (test_util_pool.TestProcessPool.test_istarmap) ... skipped 'mpi-world-size>1'
test_istarmap_unordered (test_util_pool.TestProcessPool.test_istarmap_unordered) ... skipped 'mpi-world-size>1'
test_map (test_util_pool.TestProcessPool.test_map) ... skipped 'mpi-world-size>1'
test_map_async (test_util_pool.TestProcessPool.test_map_async) ... skipped 'mpi-world-size>1'
test_map_async_callbacks (test_util_pool.TestProcessPool.test_map_async_callbacks) ... skipped 'mpi-world-size>1'
test_pool_worker_lifetime_early_close (test_util_pool.TestProcessPool.test_pool_worker_lifetime_early_close) ... skipped 'mpi-world-size>1'
test_starmap (test_util_pool.TestProcessPool.test_starmap) ... skipped 'mpi-world-size>1'
test_starmap_async (test_util_pool.TestProcessPool.test_starmap_async) ... skipped 'mpi-world-size>1'
test_terminate (test_util_pool.TestProcessPool.test_terminate) ... skipped 'mpi-world-size>1'
test_unsupported_args (test_util_pool.TestProcessPool.test_unsupported_args) ... skipped 'mpi-world-size>1'
test_apply (test_util_pool.TestThreadPool.test_apply) ... ok
test_apply (test_util_pool.TestProcessPool.test_apply) ... skipped 'mpi-world-size>1'
test_apply_async (test_util_pool.TestProcessPool.test_apply_async) ... skipped 'mpi-world-size>1'
test_apply_async_timeout (test_util_pool.TestProcessPool.test_apply_async_timeout) ... skipped 'mpi-world-size>1'
test_arg_initializer (test_util_pool.TestProcessPool.test_arg_initializer) ... skipped 'mpi-world-size>1'
test_arg_processes (test_util_pool.TestProcessPool.test_arg_processes) ... skipped 'mpi-world-size>1'
test_async_error_callback (test_util_pool.TestProcessPool.test_async_error_callback) ...
skipped 'mpi-world-size>1'
test_empty_iterable (test_util_pool.TestProcessPool.test_empty_iterable) ... skipped 'mpi-world-size>1'
test_enter_exit (test_util_pool.TestProcessPool.test_enter_exit) ... skipped 'mpi-world-size>1'
test_imap (test_util_pool.TestProcessPool.test_imap) ... skipped 'mpi-world-size>1'
test_imap_unordered (test_util_pool.TestProcessPool.test_imap_unordered) ... skipped 'mpi-world-size>1'
test_istarmap (test_util_pool.TestProcessPool.test_istarmap) ... skipped 'mpi-world-size>1'
test_istarmap_unordered (test_util_pool.TestProcessPool.test_istarmap_unordered) ... skipped 'mpi-world-size>1'
test_map (test_util_pool.TestProcessPool.test_map) ... skipped 'mpi-world-size>1'
test_map_async (test_util_pool.TestProcessPool.test_map_async) ... skipped 'mpi-world-size>1'
test_map_async_callbacks (test_util_pool.TestProcessPool.test_map_async_callbacks) ... skipped 'mpi-world-size>1'
test_pool_worker_lifetime_early_close (test_util_pool.TestProcessPool.test_pool_worker_lifetime_early_close) ... skipped 'mpi-world-size>1'
test_starmap (test_util_pool.TestProcessPool.test_starmap) ... skipped 'mpi-world-size>1'
test_starmap_async (test_util_pool.TestProcessPool.test_starmap_async) ... skipped 'mpi-world-size>1'
test_terminate (test_util_pool.TestProcessPool.test_terminate) ... skipped 'mpi-world-size>1'
test_unsupported_args (test_util_pool.TestProcessPool.test_unsupported_args) ... skipped 'mpi-world-size>1'
test_apply (test_util_pool.TestThreadPool.test_apply) ...
ok
test_apply_async (test_util_pool.TestThreadPool.test_apply_async) ... ok
test_apply_async (test_util_pool.TestThreadPool.test_apply_async) ...
ok
test_apply_async (test_util_pool.TestThreadPool.test_apply_async) ...
ok
test_apply_async (test_util_pool.TestThreadPool.test_apply_async) ...
ok
ok
test_apply_async_timeout (test_util_pool.TestThreadPool.test_apply_async_timeout) ... ok
test_apply_async_timeout (test_util_pool.TestThreadPool.test_apply_async_timeout) ...
test_apply_async_timeout (test_util_pool.TestThreadPool.test_apply_async_timeout) ...
ok
test_apply_async_timeout (test_util_pool.TestThreadPool.test_apply_async_timeout) ...
ok
test_arg_initializer (test_util_pool.TestThreadPool.test_arg_initializer) ... ok
test_arg_processes (test_util_pool.TestThreadPool.test_arg_processes) ... ok
test_async_error_callback (test_util_pool.TestThreadPool.test_async_error_callback) ... ok
test_arg_initializer (test_util_pool.TestThreadPool.test_arg_initializer) ... ok
test_arg_processes (test_util_pool.TestThreadPool.test_arg_processes) ... ok
test_async_error_callback (test_util_pool.TestThreadPool.test_async_error_callback) ... ok
test_arg_initializer (test_util_pool.TestThreadPool.test_arg_initializer) ... ok
test_arg_processes (test_util_pool.TestThreadPool.test_arg_processes) ... ok
test_async_error_callback (test_util_pool.TestThreadPool.test_async_error_callback) ...
ok
test_arg_initializer (test_util_pool.TestThreadPool.test_arg_initializer) ... ok
test_arg_processes (test_util_pool.TestThreadPool.test_arg_processes) ... ok
ok
test_async_error_callback (test_util_pool.TestThreadPool.test_async_error_callback) ... ok
test_empty_iterable (test_util_pool.TestThreadPool.test_empty_iterable) ...
test_empty_iterable (test_util_pool.TestThreadPool.test_empty_iterable) ...
ok
test_empty_iterable (test_util_pool.TestThreadPool.test_empty_iterable) ...
ok
test_empty_iterable (test_util_pool.TestThreadPool.test_empty_iterable) ... ok
test_enter_exit (test_util_pool.TestThreadPool.test_enter_exit) ... ok
test_imap (test_util_pool.TestThreadPool.test_imap) ...
ok
test_enter_exit (test_util_pool.TestThreadPool.test_enter_exit) ... ok
test_imap (test_util_pool.TestThreadPool.test_imap) ...
ok
test_enter_exit (test_util_pool.TestThreadPool.test_enter_exit) ... ok
test_imap (test_util_pool.TestThreadPool.test_imap) ...
ok
test_enter_exit (test_util_pool.TestThreadPool.test_enter_exit) ...
ok
test_imap (test_util_pool.TestThreadPool.test_imap) ...
ok
test_imap_unordered (test_util_pool.TestThreadPool.test_imap_unordered) ...
ok
test_imap_unordered (test_util_pool.TestThreadPool.test_imap_unordered) ...
ok
test_imap_unordered (test_util_pool.TestThreadPool.test_imap_unordered) ...
ok
test_imap_unordered (test_util_pool.TestThreadPool.test_imap_unordered) ...
ok
ok
test_istarmap (test_util_pool.TestThreadPool.test_istarmap) ...
test_istarmap (test_util_pool.TestThreadPool.test_istarmap) ...
ok
test_istarmap_unordered (test_util_pool.TestThreadPool.test_istarmap_unordered) ...
ok
test_istarmap_unordered (test_util_pool.TestThreadPool.test_istarmap_unordered) ...
ok
test_map (test_util_pool.TestThreadPool.test_map) ... ok
test_map (test_util_pool.TestThreadPool.test_map) ...
ok
test_istarmap (test_util_pool.TestThreadPool.test_istarmap) ...
ok
test_istarmap (test_util_pool.TestThreadPool.test_istarmap) ...
ok
test_istarmap_unordered (test_util_pool.TestThreadPool.test_istarmap_unordered) ...
ok
test_istarmap_unordered (test_util_pool.TestThreadPool.test_istarmap_unordered) ...
ok
test_map (test_util_pool.TestThreadPool.test_map) ...
ok
test_map (test_util_pool.TestThreadPool.test_map) ...
ok
test_map_async (test_util_pool.TestThreadPool.test_map_async) ...
ok
test_map_async (test_util_pool.TestThreadPool.test_map_async) ...
ok
test_map_async (test_util_pool.TestThreadPool.test_map_async) ...
ok
test_map_async (test_util_pool.TestThreadPool.test_map_async) ...
ok
test_map_async_callbacks (test_util_pool.TestThreadPool.test_map_async_callbacks) ...
ok
test_map_async_callbacks (test_util_pool.TestThreadPool.test_map_async_callbacks) ...
ok
test_pool_worker_lifetime_early_close (test_util_pool.TestThreadPool.test_pool_worker_lifetime_early_close) ...
ok
test_pool_worker_lifetime_early_close (test_util_pool.TestThreadPool.test_pool_worker_lifetime_early_close) ...
ok
test_map_async_callbacks (test_util_pool.TestThreadPool.test_map_async_callbacks) ...
ok
test_map_async_callbacks (test_util_pool.TestThreadPool.test_map_async_callbacks) ...
ok
test_pool_worker_lifetime_early_close (test_util_pool.TestThreadPool.test_pool_worker_lifetime_early_close) ...
ok
test_pool_worker_lifetime_early_close (test_util_pool.TestThreadPool.test_pool_worker_lifetime_early_close) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR test_starmap (test_util_pool.TestThreadPool.test_starmap) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR test_starmap_async (test_util_pool.TestThreadPool.test_starmap_async) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR test_starmap (test_util_pool.TestThreadPool.test_starmap) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok test_starmap_async (test_util_pool.TestThreadPool.test_starmap_async) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR test_terminate (test_util_pool.TestThreadPool.test_terminate) ... ok test_starmap (test_util_pool.TestThreadPool.test_starmap) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok test_starmap_async (test_util_pool.TestThreadPool.test_starmap_async) ... ok test_terminate (test_util_pool.TestThreadPool.test_terminate) ... ok test_starmap (test_util_pool.TestThreadPool.test_starmap) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok test_starmap_async (test_util_pool.TestThreadPool.test_starmap_async) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok test_terminate (test_util_pool.TestThreadPool.test_terminate) ... ok test_terminate (test_util_pool.TestThreadPool.test_terminate) ... 
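The `test_util_pool.TestThreadPool` cases above exercise the `map`/`imap`/`starmap` family of `mpi4py.util.pool`, which is documented to follow the standard-library `multiprocessing.pool` interface. As a rough sketch of what those calls do (using the stdlib `ThreadPool` stand-in rather than the MPI-backed pool, so it runs without an MPI launcher):

```python
# Sketch of the multiprocessing.pool-style API that mpi4py.util.pool mirrors.
# ThreadPool here is the stdlib stand-in, not the MPI-backed pool under test.
from multiprocessing.pool import ThreadPool

def add(x, y):
    return x + y

with ThreadPool(processes=2) as pool:
    # map: apply a function to each item, preserving order
    squares = pool.map(lambda n: n * n, range(5))      # [0, 1, 4, 9, 16]
    # starmap: unpack argument tuples into the function
    sums = pool.starmap(add, [(1, 2), (3, 4)])         # [3, 7]
    # map_async: returns an AsyncResult; .get() blocks for the values
    async_res = pool.map_async(lambda n: n + 1, [1, 2, 3])
    incremented = async_res.get()                      # [2, 3, 4]
```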
test_unsupported_args (test_util_pool.TestThreadPool.test_unsupported_args) ... ok
testAcquireFree (test_util_sync.TestConditionMutexSelf.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionMutexSelf.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionMutexSelf.testWaitForNotify) ... ok
testWaitNotify (test_util_sync.TestConditionMutexSelf.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionMutexSelf.testWaitNotifyAll) ... ok
testAcquireFree (test_util_sync.TestConditionMutexWorld.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionMutexWorld.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionMutexWorld.testWaitForNotify) ... ok
testWaitNotify (test_util_sync.TestConditionMutexWorld.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionMutexWorld.testWaitNotifyAll) ... ok
testAcquireFree (test_util_sync.TestConditionSelf.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionSelf.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionSelf.testWaitForNotify) ... ok
testWaitNotify (test_util_sync.TestConditionSelf.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionSelf.testWaitNotifyAll) ... ok
testAcquireFree (test_util_sync.TestConditionWorld.testAcquireFree) ... ok
testFree (test_util_sync.TestConditionWorld.testFree) ... ok
testWaitForNotify (test_util_sync.TestConditionWorld.testWaitForNotify) ... ok
testWaitNotify (test_util_sync.TestConditionWorld.testWaitNotify) ... ok
testWaitNotifyAll (test_util_sync.TestConditionWorld.testWaitNotifyAll) ... ok
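The `TestCondition*` cases exercise wait/notify semantics on `mpi4py.util.sync.Condition`, whose interface is modeled on `threading.Condition`. A minimal single-process sketch of the wait-for/notify pattern those test names suggest (stdlib `threading`, not the MPI-backed object):

```python
# Sketch of threading.Condition wait_for/notify, the API shape that
# mpi4py.util.sync.Condition follows; not the MPI-backed implementation.
import threading

condition = threading.Condition()
events = []

def waiter():
    with condition:
        # wait_for re-checks the predicate, so it is safe even if the
        # notify happens before this thread reaches the wait
        condition.wait_for(lambda: events)
        events.append("woken")

t = threading.Thread(target=waiter)
t.start()
with condition:
    events.append("go")
    condition.notify()  # wake one waiter (if any is blocked)
t.join()
# events == ["go", "woken"]
```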
testDefault (test_util_sync.TestCounterSelf.testDefault) ... ok
testFree (test_util_sync.TestCounterSelf.testFree) ... ok
testIter (test_util_sync.TestCounterSelf.testIter) ... ok
testNext (test_util_sync.TestCounterSelf.testNext) ... ok
testRoot (test_util_sync.TestCounterSelf.testRoot) ... ok
testStart (test_util_sync.TestCounterSelf.testStart) ... ok
testStep (test_util_sync.TestCounterSelf.testStep) ... ok
testTypechar (test_util_sync.TestCounterSelf.testTypechar) ... ok
testDefault (test_util_sync.TestCounterWorld.testDefault) ... ok
testFree (test_util_sync.TestCounterWorld.testFree) ... ok
testIter (test_util_sync.TestCounterWorld.testIter) ... ok
testNext (test_util_sync.TestCounterWorld.testNext) ... ok
testNext (test_util_sync.TestCounterWorld.testNext) ... ok
testRoot (test_util_sync.TestCounterWorld.testRoot) ... ok
testStart (test_util_sync.TestCounterWorld.testStart) ... ok
testStep (test_util_sync.TestCounterWorld.testStep) ... ok
testTypechar (test_util_sync.TestCounterWorld.testTypechar) ... ok
testAcquireFree (test_util_sync.TestMutexBasicSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexBasicSelf.testAcquireNonblocking) ... ok
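The `TestCounter*` names (`testStart`, `testStep`, `testNext`, `testIter`) suggest a shared monotonically increasing counter with configurable start and step, as `mpi4py.util.sync.Counter` provides across ranks. A hypothetical single-process analog (the `Counter` class below is an illustration, not the mpi4py implementation):

```python
# Hypothetical stand-in for a shared counter with start/step/next semantics.
# The real mpi4py.util.sync.Counter coordinates across MPI ranks; this one
# only coordinates across threads in a single process.
import threading
from itertools import count

class Counter:
    def __init__(self, start=0, step=1):
        self._it = count(start, step)
        self._lock = threading.Lock()

    def next(self):
        # serialize increments so concurrent callers see distinct values
        with self._lock:
            return next(self._it)

counter = Counter(start=0, step=2)
values = [counter.next() for _ in range(4)]  # [0, 2, 4, 6]
```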
testAcquireFree (test_util_sync.TestMutexBasicSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexBasicSelf.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexBasicSelf.testAcquireRelease) ... ok
testExclusion (test_util_sync.TestMutexBasicSelf.testExclusion) ... ok
testFairness (test_util_sync.TestMutexBasicSelf.testFairness) ... ok
testFree (test_util_sync.TestMutexBasicSelf.testFree) ... ok
testWith (test_util_sync.TestMutexBasicSelf.testWith) ... ok
testAcquireFree (test_util_sync.TestMutexBasicWorld.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexBasicWorld.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexBasicWorld.testAcquireRelease) ... ok
testExclusion (test_util_sync.TestMutexBasicWorld.testExclusion) ... ok
testFairness (test_util_sync.TestMutexBasicWorld.testFairness) ... ok
testFree (test_util_sync.TestMutexBasicWorld.testFree) ... ok
testWith (test_util_sync.TestMutexBasicWorld.testWith) ... ok
testAcquireFree (test_util_sync.TestMutexRecursiveSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexRecursiveSelf.testAcquireNonblocking) ... ok
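The `TestMutexBasic*` and `TestMutexRecursive*` cases cover blocking acquire, non-blocking acquire, context-manager use, and recursive (re-entrant) locking for `mpi4py.util.sync.Mutex`, whose interface resembles `threading.Lock`/`threading.RLock`. A sketch of those behaviors with the stdlib primitives (not the MPI-backed mutex):

```python
# Sketch of lock behaviors the mutex tests exercise, using stdlib
# threading.Lock / threading.RLock as stand-ins for the MPI-backed Mutex.
import threading

lock = threading.Lock()

first = lock.acquire(blocking=False)   # True: the lock was free
second = lock.acquire(blocking=False)  # False: already held, does not block
lock.release()

with lock:                             # "with" acquires and releases
    held_inside = lock.locked()        # True while inside the block

rlock = threading.RLock()
with rlock:
    with rlock:                        # re-entrant: same owner may nest
        nested_ok = True
```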
testAcquireFree (test_util_sync.TestMutexRecursiveSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexRecursiveSelf.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexRecursiveSelf.testAcquireRelease) ... ok
testFree (test_util_sync.TestMutexRecursiveSelf.testFree) ... ok
testWith (test_util_sync.TestMutexRecursiveSelf.testWith) ... ok
testAcquireFree (test_util_sync.TestMutexRecursiveWorld.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestMutexRecursiveWorld.testAcquireNonblocking) ... ok
testAcquireRelease (test_util_sync.TestMutexRecursiveWorld.testAcquireRelease) ... ok
testFree (test_util_sync.TestMutexRecursiveWorld.testFree) ... ok
testWith (test_util_sync.TestMutexRecursiveWorld.testWith) ... ok
testAcquireFree (test_util_sync.TestSemaphoreSelf.testAcquireFree) ... ok
testAcquireNonblocking (test_util_sync.TestSemaphoreSelf.testAcquireNonblocking) ... ok
ok testAcquireFree (test_util_sync.TestSemaphoreSelf.testAcquireFree) ... ok testAcquireNonblocking (test_util_sync.TestSemaphoreSelf.testAcquireNonblocking) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireRelease (test_util_sync.TestSemaphoreSelf.testAcquireRelease) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireRelease (test_util_sync.TestSemaphoreSelf.testAcquireRelease) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireRelease (test_util_sync.TestSemaphoreSelf.testAcquireRelease) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireRelease (test_util_sync.TestSemaphoreSelf.testAcquireRelease) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBounded (test_util_sync.TestSemaphoreSelf.testBounded) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testFree (test_util_sync.TestSemaphoreSelf.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValue (test_util_sync.TestSemaphoreSelf.testValue) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testWith (test_util_sync.TestSemaphoreSelf.testWith) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBounded (test_util_sync.TestSemaphoreSelf.testBounded) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFree (test_util_sync.TestSemaphoreSelf.testFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValue (test_util_sync.TestSemaphoreSelf.testValue) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testWith (test_util_sync.TestSemaphoreSelf.testWith) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBounded (test_util_sync.TestSemaphoreSelf.testBounded) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFree (test_util_sync.TestSemaphoreSelf.testFree) ... ok testValue (test_util_sync.TestSemaphoreSelf.testValue) ... ok testBounded (test_util_sync.TestSemaphoreSelf.testBounded) ... ok testFree (test_util_sync.TestSemaphoreSelf.testFree) ... ok testWith (test_util_sync.TestSemaphoreSelf.testWith) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValue (test_util_sync.TestSemaphoreSelf.testValue) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testWith (test_util_sync.TestSemaphoreSelf.testWith) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAcquireFree (test_util_sync.TestSemaphoreWorld.testAcquireFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireFree (test_util_sync.TestSemaphoreWorld.testAcquireFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireFree (test_util_sync.TestSemaphoreWorld.testAcquireFree) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireFree (test_util_sync.TestSemaphoreWorld.testAcquireFree) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireNonblocking (test_util_sync.TestSemaphoreWorld.testAcquireNonblocking) ... ok testAcquireNonblocking (test_util_sync.TestSemaphoreWorld.testAcquireNonblocking) ... ok testAcquireNonblocking (test_util_sync.TestSemaphoreWorld.testAcquireNonblocking) ... ok testAcquireNonblocking (test_util_sync.TestSemaphoreWorld.testAcquireNonblocking) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAcquireRelease (test_util_sync.TestSemaphoreWorld.testAcquireRelease) ... ok testAcquireRelease (test_util_sync.TestSemaphoreWorld.testAcquireRelease) ... ok testAcquireRelease (test_util_sync.TestSemaphoreWorld.testAcquireRelease) ... ok testAcquireRelease (test_util_sync.TestSemaphoreWorld.testAcquireRelease) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBounded (test_util_sync.TestSemaphoreWorld.testBounded) ... ok testBounded (test_util_sync.TestSemaphoreWorld.testBounded) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBounded (test_util_sync.TestSemaphoreWorld.testBounded) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBounded (test_util_sync.TestSemaphoreWorld.testBounded) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testFree (test_util_sync.TestSemaphoreWorld.testFree) ... ok testFree (test_util_sync.TestSemaphoreWorld.testFree) ... ok testFree (test_util_sync.TestSemaphoreWorld.testFree) ... ok testFree (test_util_sync.TestSemaphoreWorld.testFree) ... ok testValue (test_util_sync.TestSemaphoreWorld.testValue) ... ok testValue (test_util_sync.TestSemaphoreWorld.testValue) ... ok testValue (test_util_sync.TestSemaphoreWorld.testValue) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testValue (test_util_sync.TestSemaphoreWorld.testValue) ... ok testWith (test_util_sync.TestSemaphoreWorld.testWith) ... ok testWith (test_util_sync.TestSemaphoreWorld.testWith) ... ok testWith (test_util_sync.TestSemaphoreWorld.testWith) ... ok testWith (test_util_sync.TestSemaphoreWorld.testWith) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBeginEnd (test_util_sync.TestSequentialSelf.testBeginEnd) ... ok testWith (test_util_sync.TestSequentialSelf.testWith) ... ok testBeginEnd (test_util_sync.TestSequentialWorld.testBeginEnd) ... 
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testBeginEnd (test_util_sync.TestSequentialSelf.testBeginEnd) ... ok testWith (test_util_sync.TestSequentialSelf.testWith) ... ok testBeginEnd (test_util_sync.TestSequentialWorld.testBeginEnd) ... ok testBeginEnd (test_util_sync.TestSequentialSelf.testBeginEnd) ... ok testWith (test_util_sync.TestSequentialSelf.testWith) ... ok testBeginEnd (test_util_sync.TestSequentialWorld.testBeginEnd) ... ok testBeginEnd (test_util_sync.TestSequentialSelf.testBeginEnd) ... ok testWith (test_util_sync.TestSequentialSelf.testWith) ... ok testBeginEnd (test_util_sync.TestSequentialWorld.testBeginEnd) ... ok testWith (test_util_sync.TestSequentialWorld.testWith) ... ok ok testWith (test_util_sync.TestSequentialWorld.testWith) ... ok ok testWith (test_util_sync.TestSequentialWorld.testWith) ... ok ok testWith (test_util_sync.TestSequentialWorld.testWith) ... ok testAttributes (test_win.TestWinAllocateSelf.testAttributes) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] ok testCreateFlavor (test_win.TestWinAllocateSelf.testCreateFlavor) ... ok testGetAttr (test_win.TestWinAllocateSelf.testGetAttr) ... ok testGetGroup (test_win.TestWinAllocateSelf.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinAllocateSelf.testGetSetErrhandler) ... ok testGetSetInfo (test_win.TestWinAllocateSelf.testGetSetInfo) ... ok testGetSetName (test_win.TestWinAllocateSelf.testGetSetName) ... ok testMemory (test_win.TestWinAllocateSelf.testMemory) ... ok testMemoryModel (test_win.TestWinAllocateSelf.testMemoryModel) ... ok testPickle (test_win.TestWinAllocateSelf.testPickle) ... ok testPyProps (test_win.TestWinAllocateSelf.testPyProps) ... ok testAttributes (test_win.TestWinAllocateSharedSelf.testAttributes) ... 
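The test_util_sync cases above exercise the distributed synchronization helpers in mpi4py.util.sync (Mutex, Semaphore, Sequential), run under both COMM_SELF and COMM_WORLD. Their acquire/release, nonblocking-acquire, bounded-release, and context-manager semantics mirror Python's threading primitives; the following is a minimal single-process sketch of what testAcquireNonblocking, testBounded, and testWith verify, using threading stand-ins rather than the MPI-backed classes (an illustrative analogue, not the mpi4py implementation):

```python
# Illustrative single-process analogue of the semantics the mpi4py.util.sync
# tests check; threading primitives stand in for the MPI-backed Mutex and
# Semaphore (which take an MPI communicator instead of running in-process).
import threading

# Recursive mutex: a nonblocking acquire succeeds when the lock is free, and
# the owner may re-acquire it (cf. TestMutexRecursive*.testAcquireNonblocking).
mutex = threading.RLock()
assert mutex.acquire(blocking=False)
assert mutex.acquire(blocking=False)   # recursive re-acquire by the owner
mutex.release()
mutex.release()

# Bounded semaphore: context-manager use (cf. testWith) and over-release
# detection (cf. testBounded).
sem = threading.BoundedSemaphore(1)
with sem:
    pass
over_released = False
try:
    sem.release()                      # releasing beyond the initial value
except ValueError:
    over_released = True
assert over_released
```

The MPI-backed versions provide the same protocol across processes, which is why the same five or seven test methods repeat for the Self and World communicator variants above.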
[stderr interleaved from 4 MPI ranks, continued; each test listed once, proxy CMD_STDERR debug lines elided]
testCreateFlavor (test_win.TestWinAllocateSharedSelf.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinAllocateSharedSelf.testGetAttr) ... ok
testGetGroup (test_win.TestWinAllocateSharedSelf.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinAllocateSharedSelf.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinAllocateSharedSelf.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinAllocateSharedSelf.testGetSetName) ... ok
testMemory (test_win.TestWinAllocateSharedSelf.testMemory) ... ok
testMemoryModel (test_win.TestWinAllocateSharedSelf.testMemoryModel) ... ok
testPickle (test_win.TestWinAllocateSharedSelf.testPickle) ... ok
testPyProps (test_win.TestWinAllocateSharedSelf.testPyProps) ... ok
testSharedQuery (test_win.TestWinAllocateSharedSelf.testSharedQuery) ... ok
testAttributes (test_win.TestWinAllocateSharedWorld.testAttributes) ... ok
testCreateFlavor (test_win.TestWinAllocateSharedWorld.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinAllocateSharedWorld.testGetAttr) ... ok
testGetGroup (test_win.TestWinAllocateSharedWorld.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinAllocateSharedWorld.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinAllocateSharedWorld.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinAllocateSharedWorld.testGetSetName) ... ok
testMemory (test_win.TestWinAllocateSharedWorld.testMemory) ... ok
testMemoryModel (test_win.TestWinAllocateSharedWorld.testMemoryModel) ... ok
testPickle (test_win.TestWinAllocateSharedWorld.testPickle) ... ok
testPyProps (test_win.TestWinAllocateSharedWorld.testPyProps) ... ok
testSharedQuery (test_win.TestWinAllocateSharedWorld.testSharedQuery) ... ok
testAttributes (test_win.TestWinAllocateWorld.testAttributes) ... ok
testCreateFlavor (test_win.TestWinAllocateWorld.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinAllocateWorld.testGetAttr) ... ok
testGetGroup (test_win.TestWinAllocateWorld.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinAllocateWorld.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinAllocateWorld.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinAllocateWorld.testGetSetName) ... ok
testMemory (test_win.TestWinAllocateWorld.testMemory) ... ok
testMemoryModel (test_win.TestWinAllocateWorld.testMemoryModel) ... ok
testPickle (test_win.TestWinAllocateWorld.testPickle) ... ok
testPyProps (test_win.TestWinAllocateWorld.testPyProps) ... ok
testAttachDetach (test_win.TestWinCreateDynamicSelf.testAttachDetach) ... ok
testAttributes (test_win.TestWinCreateDynamicSelf.testAttributes) ... ok
testCreateFlavor (test_win.TestWinCreateDynamicSelf.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinCreateDynamicSelf.testGetAttr) ... ok
testGetGroup (test_win.TestWinCreateDynamicSelf.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinCreateDynamicSelf.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinCreateDynamicSelf.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinCreateDynamicSelf.testGetSetName) ... ok
testMemory (test_win.TestWinCreateDynamicSelf.testMemory) ... ok
testMemoryModel (test_win.TestWinCreateDynamicSelf.testMemoryModel) ... ok
testPickle (test_win.TestWinCreateDynamicSelf.testPickle) ... ok
testPyProps (test_win.TestWinCreateDynamicSelf.testPyProps) ... ok
testAttachDetach (test_win.TestWinCreateDynamicWorld.testAttachDetach) ... ok
testAttributes (test_win.TestWinCreateDynamicWorld.testAttributes) ... ok
testCreateFlavor (test_win.TestWinCreateDynamicWorld.testCreateFlavor) ... ok
testGetAttr (test_win.TestWinCreateDynamicWorld.testGetAttr) ... ok
testGetGroup (test_win.TestWinCreateDynamicWorld.testGetGroup) ... ok
testGetSetErrhandler (test_win.TestWinCreateDynamicWorld.testGetSetErrhandler) ... ok
testGetSetInfo (test_win.TestWinCreateDynamicWorld.testGetSetInfo) ... ok
testGetSetName (test_win.TestWinCreateDynamicWorld.testGetSetName) ... ok
testMemory (test_win.TestWinCreateDynamicWorld.testMemory) ... ok
testMemoryModel (test_win.TestWinCreateDynamicWorld.testMemoryModel) ... ok
testPickle (test_win.TestWinCreateDynamicWorld.testPickle) ... ok
testPyProps (test_win.TestWinCreateDynamicWorld.testPyProps) ...
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testPyProps (test_win.TestWinCreateDynamicWorld.testPyProps) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testAttributes (test_win.TestWinCreateSelf.testAttributes) ... ok testCreateFlavor (test_win.TestWinCreateSelf.testCreateFlavor) ... ok testGetAttr (test_win.TestWinCreateSelf.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateSelf.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateSelf.testGetSetErrhandler) ... ok testGetSetInfo (test_win.TestWinCreateSelf.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateSelf.testGetSetName) ... ok testMemory (test_win.TestWinCreateSelf.testMemory) ... ok testMemoryModel (test_win.TestWinCreateSelf.testMemoryModel) ... ok testPickle (test_win.TestWinCreateSelf.testPickle) ... ok testPyProps (test_win.TestWinCreateSelf.testPyProps) ... ok testAttributes (test_win.TestWinCreateSelf.testAttributes) ... ok testCreateFlavor (test_win.TestWinCreateSelf.testCreateFlavor) ... ok testGetAttr (test_win.TestWinCreateSelf.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateSelf.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateSelf.testGetSetErrhandler) ... ok testGetSetInfo (test_win.TestWinCreateSelf.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateSelf.testGetSetName) ... ok testMemory (test_win.TestWinCreateSelf.testMemory) ... ok testMemoryModel (test_win.TestWinCreateSelf.testMemoryModel) ... ok testPickle (test_win.TestWinCreateSelf.testPickle) ... ok ok testAttributes (test_win.TestWinCreateSelf.testAttributes) ... 
ok testCreateFlavor (test_win.TestWinCreateSelf.testCreateFlavor) ... ok testGetAttr (test_win.TestWinCreateSelf.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateSelf.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateSelf.testGetSetErrhandler) ... ok testGetSetInfo (test_win.TestWinCreateSelf.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateSelf.testGetSetName) ... ok testMemory (test_win.TestWinCreateSelf.testMemory) ... ok testMemoryModel (test_win.TestWinCreateSelf.testMemoryModel) ... ok testPickle (test_win.TestWinCreateSelf.testPickle) ... ok testPyProps (test_win.TestWinCreateSelf.testPyProps) ... ok testAttributes (test_win.TestWinCreateWorld.testAttributes) ... ok testPyProps (test_win.TestWinCreateDynamicWorld.testPyProps) ... ok testAttributes (test_win.TestWinCreateSelf.testAttributes) ... ok testCreateFlavor (test_win.TestWinCreateSelf.testCreateFlavor) ... ok testGetAttr (test_win.TestWinCreateSelf.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateSelf.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateSelf.testGetSetErrhandler) ... ok testGetSetInfo (test_win.TestWinCreateSelf.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateSelf.testGetSetName) ... ok testMemory (test_win.TestWinCreateSelf.testMemory) ... ok testMemoryModel (test_win.TestWinCreateSelf.testMemoryModel) ... ok testPickle (test_win.TestWinCreateSelf.testPickle) ... ok testPyProps (test_win.TestWinCreateSelf.testPyProps) ... ok ok testAttributes (test_win.TestWinCreateWorld.testAttributes) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR testAttributes (test_win.TestWinCreateWorld.testAttributes) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testCreateFlavor (test_win.TestWinCreateWorld.testCreateFlavor) ... 
ok testGetAttr (test_win.TestWinCreateWorld.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateWorld.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateWorld.testGetSetErrhandler) ... testPyProps (test_win.TestWinCreateSelf.testPyProps) ... ok testAttributes (test_win.TestWinCreateWorld.testAttributes) ... ok testCreateFlavor (test_win.TestWinCreateWorld.testCreateFlavor) ... ok testGetAttr (test_win.TestWinCreateWorld.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateWorld.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateWorld.testGetSetErrhandler) ... ok testCreateFlavor (test_win.TestWinCreateWorld.testCreateFlavor) ... ok testGetAttr (test_win.TestWinCreateWorld.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateWorld.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateWorld.testGetSetErrhandler) ... ok testCreateFlavor (test_win.TestWinCreateWorld.testCreateFlavor) ... ok testGetAttr (test_win.TestWinCreateWorld.testGetAttr) ... ok testGetGroup (test_win.TestWinCreateWorld.testGetGroup) ... ok testGetSetErrhandler (test_win.TestWinCreateWorld.testGetSetErrhandler) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testGetSetInfo (test_win.TestWinCreateWorld.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateWorld.testGetSetName) ... ok testMemory (test_win.TestWinCreateWorld.testMemory) ... ok testMemoryModel (test_win.TestWinCreateWorld.testMemoryModel) ... ok testPickle (test_win.TestWinCreateWorld.testPickle) ... ok testPyProps (test_win.TestWinCreateWorld.testPyProps) ... ok testGetSetInfo (test_win.TestWinCreateWorld.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateWorld.testGetSetName) ... ok testMemory (test_win.TestWinCreateWorld.testMemory) ... 
ok testMemoryModel (test_win.TestWinCreateWorld.testMemoryModel) ... ok testPickle (test_win.TestWinCreateWorld.testPickle) ... ok testPyProps (test_win.TestWinCreateWorld.testPyProps) ... ok testGetSetInfo (test_win.TestWinCreateWorld.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateWorld.testGetSetName) ... ok testMemory (test_win.TestWinCreateWorld.testMemory) ... ok testMemoryModel (test_win.TestWinCreateWorld.testMemoryModel) ... ok testPickle (test_win.TestWinCreateWorld.testPickle) ... ok testPyProps (test_win.TestWinCreateWorld.testPyProps) ... ok testGetSetInfo (test_win.TestWinCreateWorld.testGetSetInfo) ... ok testGetSetName (test_win.TestWinCreateWorld.testGetSetName) ... ok testMemory (test_win.TestWinCreateWorld.testMemory) ... ok testMemoryModel (test_win.TestWinCreateWorld.testMemoryModel) ... ok testPickle (test_win.TestWinCreateWorld.testPickle) ... ok testPyProps (test_win.TestWinCreateWorld.testPyProps) ... [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR [proxy:0@virt32a] Sending upstream hdr.cmd = CMD_STDERR ok testConstructor (test_win.TestWinNull.testConstructor) ... ok testGetName (test_win.TestWinNull.testGetName) ... ok ---------------------------------------------------------------------- Ran 2081 tests in 415.805s OK (skipped=99) ok testConstructor (test_win.TestWinNull.testConstructor) ... ok testGetName (test_win.TestWinNull.testGetName) ... ok ---------------------------------------------------------------------- Ran 2081 tests in 415.804s OK (skipped=99) ok testConstructor (test_win.TestWinNull.testConstructor) ... ok testGetName (test_win.TestWinNull.testGetName) ... ok ---------------------------------------------------------------------- Ran 2081 tests in 415.794s OK (skipped=99) ok testConstructor (test_win.TestWinNull.testConstructor) ... ok testGetName (test_win.TestWinNull.testGetName) ... 
ok
----------------------------------------------------------------------
Ran 2081 tests in 415.790s
OK (skipped=99)
[proxy:0@virt32a] got pmi command from downstream 0-1: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 0-2: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 0-3: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] got pmi command from downstream 0-0: cmd=finalize
[proxy:0@virt32a] Sending PMI command: cmd=finalize_ack rc=0
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_EXIT_STATUS
[proxy:0@virt32a] Sending upstream hdr.cmd = CMD_EXIT_STATUS
make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0'
create-stamp debian/debhelper-build-stamp
   dh_testroot -O--buildsystem=pybuild
   dh_prep -O--buildsystem=pybuild
   debian/rules override_dh_auto_install
make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0'
	dh_auto_install
I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/mpi4py-4.0.0/debian/tmp
running install
running build
running build_src
running build_py
copying src/mpi4py/MPI.h -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py
copying src/mpi4py/MPI_api.h -> /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py
running build_ext
MPI configuration: [mpi] from 'mpi.cfg'
MPI C compiler: /usr/bin/mpicc
MPI C++ compiler: /usr/bin/mpicxx
running install_lib
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages
creating
/build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/py.typed -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/MPI.h -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/typing.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/run.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/MPI.pxd -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/MPI.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__main__.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__pycache__/__init__.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__pycache__/typing.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__pycache__/run.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/run.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/dtlib.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/pool.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/sync.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/pkl5.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/__pycache__/sync.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/__pycache__/__init__.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/__pycache__/pkl5.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/__pycache__/dtlib.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/__pycache__/pool.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/sync.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/pkl5.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/pool.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/__init__.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/__init__.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/util/dtlib.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__init__.pxd -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/bench.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/MPI_api.h -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/MPI.cpython-312-arm-linux-gnueabihf.so -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__init__.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/bench.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__init__.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/include
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/include/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py/mpi.pxi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/include/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py/pycapi.h -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/include/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py/mpi4py.i -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/include/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/include/mpi4py/mpi4py.h -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/include/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/libmpi.pxd -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/aplus.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/pool.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__main__.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/server.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/aplus.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
creating /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__pycache__/__init__.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__pycache__/_core.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__pycache__/pool.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__pycache__/server.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__pycache__/util.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__pycache__/_base.cpython-312.pyc -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__pycache__
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/server.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/util.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/_base.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/pool.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/_core.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/_base.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/util.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__init__.pyi -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__init__.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/_core.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/futures/__main__.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/typing.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
copying /build/reproducible-path/mpi4py-4.0.0/.pybuild/cpython3_3.12/build/mpi4py/__main__.py -> /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/run.py to run.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/sync.py to sync.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/pkl5.py to pkl5.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/pool.py to pool.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/__init__.py to __init__.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/util/dtlib.py to dtlib.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/bench.py to bench.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/__init__.py to __init__.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/aplus.py to aplus.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/server.py to server.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/_base.py to _base.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/pool.py to pool.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/util.py to util.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__init__.py to __init__.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/_core.py to _core.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/futures/__main__.py to __main__.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/typing.py to typing.cpython-312.pyc
byte-compiling /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py/__main__.py to __main__.cpython-312.pyc
running install_egg_info
running egg_info
creating src/mpi4py.egg-info
writing src/mpi4py.egg-info/PKG-INFO
writing dependency_links to src/mpi4py.egg-info/dependency_links.txt
writing top-level names to src/mpi4py.egg-info/top_level.txt
writing manifest file 'src/mpi4py.egg-info/SOURCES.txt'
reading manifest file 'src/mpi4py.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
adding license file 'LICENSE.rst'
writing manifest file 'src/mpi4py.egg-info/SOURCES.txt'
Copying src/mpi4py.egg-info to /build/reproducible-path/mpi4py-4.0.0/debian/tmp/usr/lib/python3.12/dist-packages/mpi4py-4.0.0.egg-info
Skipping SOURCES.txt
running install_scripts
: # Remove python-mpi binaries
find /build/reproducible-path/mpi4py-4.0.0/debian/tmp -name python-mpi -delete
find /build/reproducible-path/mpi4py-4.0.0/debian/tmp -type d -empty -delete
make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0'
   debian/rules override_dh_install-arch
make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0'
	dh_install
: # create symlinks for .h files
: # Can have python$v symlink pointing to python3.?m or python3.?mu
: # see #700995 for more details.
So first look where it points to : # and use that directory set -e; for v in 3.12; do \ ABITAG=`python$v -c "import sys; print(sys.abiflags)"`; \ [ -d /build/reproducible-path/mpi4py-4.0.0/debian/python3-mpi4py/usr/include/python$v$ABITAG ] || \ mkdir -p /build/reproducible-path/mpi4py-4.0.0/debian/python3-mpi4py/usr/include/python$v$ABITAG; \ dh_link -ppython3-mpi4py usr/lib/python3/dist-packages/mpi4py/include/mpi4py usr/include/python$v$ABITAG/mpi4py; \ done make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' dh_install -O--buildsystem=pybuild -Npython3-mpi4py dh_installdocs -O--buildsystem=pybuild dh_installdocs: warning: Cannot auto-detect main package for python-mpi4py-doc. If the default is wrong, please use --doc-main-package debian/rules override_dh_sphinxdoc make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' dh_sphinxdoc -ppython-mpi4py-doc make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' debian/rules override_dh_installchangelogs make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' dh_installchangelogs CHANGES.rst make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' dh_installman -O--buildsystem=pybuild dh_installinfo -O--buildsystem=pybuild dh_python3 -O--buildsystem=pybuild dh_installsystemduser -O--buildsystem=pybuild dh_perl -O--buildsystem=pybuild dh_link -O--buildsystem=pybuild dh_strip_nondeterminism -O--buildsystem=pybuild debian/rules override_dh_compress make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' dh_compress -X.py -X.html -X.css -X.jpg -X.txt -X.js -X.json -X.rtc -X.par -X.bin -Xobjects.inv -X.pdf make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0' dh_fixperms -O--buildsystem=pybuild dh_missing -O--buildsystem=pybuild debian/rules override_dh_dwz make[1]: Entering directory '/build/reproducible-path/mpi4py-4.0.0' echo "dh_dwz is currently deactivated since it generates an error: \".debug_line reference above end of section\"" 
dh_dwz is currently deactivated since it generates an error: ".debug_line reference above end of section"
make[1]: Leaving directory '/build/reproducible-path/mpi4py-4.0.0'
dh_strip -a -O--buildsystem=pybuild
dh_makeshlibs -a -O--buildsystem=pybuild
dh_shlibdeps -a -O--buildsystem=pybuild
dpkg-shlibdeps: warning: diversions involved - output may be incorrect
 diversion by libc6 from: /lib/ld-linux-armhf.so.3
dpkg-shlibdeps: warning: diversions involved - output may be incorrect
 diversion by libc6 to: /lib/ld-linux-armhf.so.3.usr-is-merged
dh_installdeb -O--buildsystem=pybuild
dh_gencontrol -O--buildsystem=pybuild
dpkg-gencontrol: warning: package python-mpi4py-doc: substitution variable ${sphinxdoc:Built-Using} unused, but is defined
dh_md5sums -O--buildsystem=pybuild
dh_builddeb -O--buildsystem=pybuild
dpkg-deb: building package 'python3-mpi4py' in '../python3-mpi4py_4.0.0-8_armhf.deb'.
dpkg-deb: building package 'python-mpi4py-doc' in '../python-mpi4py-doc_4.0.0-8_all.deb'.
dpkg-deb: building package 'python3-mpi4py-dbgsym' in '../python3-mpi4py-dbgsym_4.0.0-8_armhf.deb'.
 dpkg-genbuildinfo --build=binary -O../mpi4py_4.0.0-8_armhf.buildinfo
 dpkg-genchanges --build=binary -O../mpi4py_4.0.0-8_armhf.changes
dpkg-genchanges: info: binary-only upload (no source code included)
 dpkg-source --after-build .
dpkg-buildpackage: info: binary-only upload (no source included)
dpkg-genchanges: info: not including original source code in upload
I: copying local configuration
I: unmounting dev/ptmx filesystem
I: unmounting dev/pts filesystem
I: unmounting dev/shm filesystem
I: unmounting proc filesystem
I: unmounting sys filesystem
I: cleaning the build env
I: removing directory /srv/workspace/pbuilder/20615 and its subdirectories
I: Current time: Wed Nov 6 23:00:02 -12 2024
I: pbuilder-time-stamp: 1730977202
Thu Nov 7 11:00:31 UTC 2024 I: 1st build successful. Starting 2nd build on remote node virt64z-armhf-rb.debian.net.
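[Editor's note: the override_dh_install-arch step in this log decides the include-directory name by asking the interpreter for its ABI tag via `python$v -c "import sys; print(sys.abiflags)"`. The following is a minimal Python sketch of that same probe; the derived path is illustrative of what the rules loop constructs, not output reproduced from this build.]

```python
import sys

# sys.abiflags is the ABI suffix appended to the include-directory name.
# It is empty on current CPython 3 builds, but was e.g. "m" or "mu" on
# older releases -- the python3.?m / python3.?mu case the rules comment
# mentions (Debian bug #700995).
abitag = sys.abiflags
version = f"{sys.version_info.major}.{sys.version_info.minor}"

# Directory the dh_link symlink would land under, mirroring the loop:
include_dir = f"/usr/include/python{version}{abitag}"
print(include_dir)
```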
Thu Nov 7 11:00:31 UTC 2024 I: Preparing to do remote build '2' on virt64z-armhf-rb.debian.net.
Thu Nov 7 12:15:43 UTC 2024 I: Deleting $TMPDIR on virt64z-armhf-rb.debian.net.
Thu Nov 7 12:15:46 UTC 2024 I: mpi4py_4.0.0-8_armhf.changes:
Format: 1.8
Date: Sun, 29 Sep 2024 09:51:16 +0200
Source: mpi4py
Binary: python-mpi4py-doc python3-mpi4py python3-mpi4py-dbgsym
Architecture: all armhf
Version: 4.0.0-8
Distribution: unstable
Urgency: medium
Maintainer: Debian Science Maintainers
Changed-By: Drew Parsons
Description:
 python-mpi4py-doc - bindings of the MPI standard -- documentation
 python3-mpi4py - bindings of the Message Passing Interface (MPI) standard
Changes:
 mpi4py (4.0.0-8) unstable; urgency=medium
 .
   * python3-mpi4py Breaks: python3-pyzoltan (<< 1.0.1-10~)
     mpi4py 4 requires pyzoltan to patch the definition of MPI_Session
Checksums-Sha1:
 517cb98d3d11fe4383928694bfa7d6ab7ffa1990 10723 mpi4py_4.0.0-8_armhf.buildinfo
 4a9edacb03865f42364c8638063622e205938d43 1518280 python-mpi4py-doc_4.0.0-8_all.deb
 52cfeeadba45ebf10337fc15be20b10e4c1025e6 4121080 python3-mpi4py-dbgsym_4.0.0-8_armhf.deb
 29a46baa3f13efe03e18f956a146ec54d34c12b1 644160 python3-mpi4py_4.0.0-8_armhf.deb
Checksums-Sha256:
 9aa91c1f7acae738cfc0fb97bda2f4e032ecfc81bf75b2d841360759718cfa3e 10723 mpi4py_4.0.0-8_armhf.buildinfo
 2d0368434a7d38b8e8c07e7338b01119f27c234682b78a7f38400a24c47a0ecf 1518280 python-mpi4py-doc_4.0.0-8_all.deb
 f26f3a33036a821c4684df1ce7535414b038b74867129e4e69b6c8f70e30d543 4121080 python3-mpi4py-dbgsym_4.0.0-8_armhf.deb
 5edf251c784891207e51a65814e545b9defdf6870196b1eb11b66e36c8340778 644160 python3-mpi4py_4.0.0-8_armhf.deb
Files:
 ff41330c2116e75801e02674f6fe24cb 10723 python optional mpi4py_4.0.0-8_armhf.buildinfo
 02c454c7b796941473a2a689503d246a 1518280 doc optional python-mpi4py-doc_4.0.0-8_all.deb
 371daac19797c982f440bbff8d83449e 4121080 debug optional python3-mpi4py-dbgsym_4.0.0-8_armhf.deb
 d605f96003ff1bbb679633dc937d0e63 644160 python optional python3-mpi4py_4.0.0-8_armhf.deb
Thu Nov 7 12:15:55 UTC 2024 I: diffoscope 282 will be used to compare the two builds:
Running as unit: rb-diffoscope-armhf_9-10027.service
# Profiling output for: /usr/bin/diffoscope --timeout 7200 --html /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/mpi4py_4.0.0-8.diffoscope.html --text /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/mpi4py_4.0.0-8.diffoscope.txt --json /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/mpi4py_4.0.0-8.diffoscope.json --profile=- /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b1/mpi4py_4.0.0-8_armhf.changes /srv/reproducible-results/rbuild-debian/r-b-build.vJQMYox3/b2/mpi4py_4.0.0-8_armhf.changes
## close_archive (total time: 0.000s)
  0.000s     6 calls    diffoscope.comparators.xz.XzContainer
  0.000s     2 calls    diffoscope.comparators.deb.DebContainer
  0.000s     4 calls    diffoscope.comparators.tar.TarContainer
  0.000s     2 calls    diffoscope.comparators.gzip.GzipContainer
  0.000s     2 calls    diffoscope.comparators.deb.DebTarContainer
## command (total time: 9.210s)
  5.124s    10 calls    xxd
  3.106s     6 calls    diff
  0.889s     6 calls    xz
  0.030s     4 calls    cmp
  0.030s     4 calls    cmp (external)
  0.029s     2 calls    gzip
  0.002s    16 calls    cmp (internal)
## compare_files (cumulative) (total time: 37.938s)
  8.180s     1 call     abc.DotChangesFile
  7.585s     1 call     abc.DebFile
  7.415s     2 calls    abc.XzFile
  6.441s     1 call     abc.DebDataTarFile
  4.157s     1 call     abc.GzipFile
  4.086s     1 call     diffoscope.comparators.utils.archive.ArchiveMember
  0.053s     1 call     abc.TarFile
  0.021s     1 call     abc.Md5sumsFile
## container_extract (total time: 2.058s)
  0.996s   810 calls    diffoscope.comparators.deb.DebTarContainer
  0.889s     6 calls    diffoscope.comparators.xz.XzContainer
  0.141s     8 calls    diffoscope.comparators.deb.DebContainer
  0.029s     2 calls    diffoscope.comparators.gzip.GzipContainer
  0.004s     6 calls    diffoscope.comparators.tar.TarContainer
## diff (total time: 0.113s)
  0.113s     7 calls    linediff
## has_same_content_as (total time: 0.034s)
  0.014s     1 call     abc.DebDataTarFile
  0.012s     3 calls    abc.DebFile
  0.006s     1 call     diffoscope.comparators.utils.archive.ArchiveMember
  0.001s     5 calls    diffoscope.comparators.utils.libarchive.LibarchiveSymlink
  0.000s     2 calls    abc.Md5sumsFile
  0.000s     2 calls    abc.XzFile
  0.000s     2 calls    abc.TextFile
  0.000s     1 call     abc.DotChangesFile
  0.000s     1 call     abc.TarFile
  0.000s     1 call     abc.GzipFile
## main (total time: 12.030s)
 12.028s     2 calls    outputs
  0.001s     1 call     cleanup
## open_archive (total time: 0.000s)
  0.000s     6 calls    diffoscope.comparators.xz.XzContainer
  0.000s     2 calls    diffoscope.comparators.gzip.GzipContainer
  0.000s     4 calls    diffoscope.comparators.tar.TarContainer
  0.000s     2 calls    diffoscope.comparators.deb.DebContainer
  0.000s     2 calls    diffoscope.comparators.deb.DebTarContainer
## output (total time: 0.163s)
  0.162s     1 call     html
  0.001s     1 call     text
  0.000s     1 call     json
## recognizes (total time: 1.093s)
  0.953s    12 calls    diffoscope.comparators.binary.FilesystemFile
  0.058s   584 calls    diffoscope.comparators.utils.libarchive.LibarchiveMember
  0.050s   632 calls    diffoscope.comparators.utils.archive.ArchiveMember
  0.032s   162 calls    diffoscope.comparators.debian.DebControlMember
## specialize (total time: 0.136s)
  0.136s    19 calls    specialize
Finished with result: success
Main processes terminated with: code=exited/status=1
Service runtime: 12.415s
CPU time consumed: 6.555s
Thu Nov 7 12:16:09 UTC 2024 W: Diffoscope claims the build is reproducible, but there is a diffoscope file. Please investigate.
Thu Nov 7 12:16:09 UTC 2024 E: mpi4py failed to build reproducibly in trixie on armhf.
Thu Nov 7 12:16:17 UTC 2024 I: Submitting .buildinfo files to external archives:
Thu Nov 7 12:16:17 UTC 2024 I: Submitting 12K b1/mpi4py_4.0.0-8_armhf.buildinfo.asc
Thu Nov 7 12:16:18 UTC 2024 I: Submitting 12K b2/mpi4py_4.0.0-8_armhf.buildinfo.asc
Thu Nov 7 12:16:19 UTC 2024 I: Done submitting .buildinfo files to http://buildinfo.debian.net/api/submit.
Thu Nov 7 12:16:19 UTC 2024 I: Done submitting .buildinfo files.
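[Editor's note: the two builds are compared as directories b1/ and b2/, and an artifact is reproducible only when both copies match byte for byte, i.e. their SHA-256 digests agree as in the Checksums-Sha256 lists above. The sketch below shows that digest-level check; `is_reproducible` and the b1/b2 layout are illustrative, not part of the reproducible-builds tooling.]

```python
import hashlib
from pathlib import Path

def sha256sum(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 64 KiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_reproducible(b1: Path, b2: Path, name: str) -> bool:
    """An artifact is bit-for-bit reproducible when the copies from
    both build-result directories hash identically."""
    return sha256sum(b1 / name) == sha256sum(b2 / name)
```

When the digests differ, diffoscope is then invoked (as in the log above) to explain *where* inside the archives the two builds diverge.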
Thu Nov 7 12:16:19 UTC 2024 I: Removing signed mpi4py_4.0.0-8_armhf.buildinfo.asc files:
removed './b1/mpi4py_4.0.0-8_armhf.buildinfo.asc'
removed './b2/mpi4py_4.0.0-8_armhf.buildinfo.asc'
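[Editor's note: the Checksums-Sha256 field quoted in the .changes output above is a line-oriented format, one "&lt;sha256&gt; &lt;size&gt; &lt;filename&gt;" triple per artifact. A minimal parser sketch follows; `parse_checksums` is a hypothetical helper, and the field body is assumed to be already extracted from the control stanza.]

```python
def parse_checksums(field: str) -> dict[str, tuple[str, int]]:
    """Map filename -> (sha256 hex digest, size in bytes)."""
    entries = {}
    for line in field.strip().splitlines():
        digest, size, name = line.split()
        entries[name] = (digest, int(size))
    return entries

# Two of the entries from the .changes file shown above.
sample = """\
9aa91c1f7acae738cfc0fb97bda2f4e032ecfc81bf75b2d841360759718cfa3e 10723 mpi4py_4.0.0-8_armhf.buildinfo
5edf251c784891207e51a65814e545b9defdf6870196b1eb11b66e36c8340778 644160 python3-mpi4py_4.0.0-8_armhf.deb
"""
for name, (digest, size) in parse_checksums(sample).items():
    print(f"{name}: {size} bytes, sha256 {digest[:12]}...")
```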