Diff of the two buildlogs: -- --- b1/build.log 2024-11-09 09:45:20.756575011 +0000 +++ b2/build.log 2024-11-09 09:51:51.782624520 +0000 @@ -1,6 +1,6 @@ I: pbuilder: network access will be disabled during build -I: Current time: Fri Nov 8 21:43:21 -12 2024 -I: pbuilder-time-stamp: 1731145401 +I: Current time: Sat Nov 9 23:45:35 +14 2024 +I: pbuilder-time-stamp: 1731145535 I: Building the build Environment I: extracting base tarball [/var/cache/pbuilder/unstable-reproducible-base.tgz] I: copying local configuration @@ -27,52 +27,84 @@ dpkg-source: info: applying no-local-ips.patch I: Not using root during the build. I: Installing the build-deps -I: user script /srv/workspace/pbuilder/15679/tmp/hooks/D02_print_environment starting +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/D01_modify_environment starting +debug: Running on ff4a. +I: Changing host+domainname to test build reproducibility +I: Adding a custom variable just for the fun of it... +I: Changing /bin/sh to bash +'/bin/sh' -> '/bin/bash' +lrwxrwxrwx 1 root root 9 Nov 9 09:46 /bin/sh -> /bin/bash +I: Setting pbuilder2's login shell to /bin/bash +I: Setting pbuilder2's GECOS to second user,second room,second work-phone,second home-phone,second other +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/D01_modify_environment finished +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/D02_print_environment starting I: set - BUILDDIR='/build/reproducible-path' - BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other' - BUILDUSERNAME='pbuilder1' - BUILD_ARCH='armhf' - DEBIAN_FRONTEND='noninteractive' - DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=3 ' - DISTRIBUTION='unstable' - HOME='/root' - HOST_ARCH='armhf' + BASH=/bin/sh + BASHOPTS=checkwinsize:cmdhist:complete_fullquote:extquote:force_fignore:globasciiranges:globskipdots:hostcomplete:interactive_comments:patsub_replacement:progcomp:promptvars:sourcepath + BASH_ALIASES=() + BASH_ARGC=() + BASH_ARGV=() + BASH_CMDS=() + BASH_LINENO=([0]="12" [1]="0") + BASH_LOADABLES_PATH=/usr/local/lib/bash:/usr/lib/bash:/opt/local/lib/bash:/usr/pkg/lib/bash:/opt/pkg/lib/bash:. 
+ BASH_SOURCE=([0]="/tmp/hooks/D02_print_environment" [1]="/tmp/hooks/D02_print_environment") + BASH_VERSINFO=([0]="5" [1]="2" [2]="32" [3]="1" [4]="release" [5]="arm-unknown-linux-gnueabihf") + BASH_VERSION='5.2.32(1)-release' + BUILDDIR=/build/reproducible-path + BUILDUSERGECOS='second user,second room,second work-phone,second home-phone,second other' + BUILDUSERNAME=pbuilder2 + BUILD_ARCH=armhf + DEBIAN_FRONTEND=noninteractive + DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=4 ' + DIRSTACK=() + DISTRIBUTION=unstable + EUID=0 + FUNCNAME=([0]="Echo" [1]="main") + GROUPS=() + HOME=/root + HOSTNAME=i-capture-the-hostname + HOSTTYPE=arm + HOST_ARCH=armhf IFS=' ' - INVOCATION_ID='11948fd6d2674069a65338d4d607af00' - LANG='C' - LANGUAGE='en_US:en' - LC_ALL='C' - MAIL='/var/mail/root' - OPTIND='1' - PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' - PBCURRENTCOMMANDLINEOPERATION='build' - PBUILDER_OPERATION='build' - PBUILDER_PKGDATADIR='/usr/share/pbuilder' - PBUILDER_PKGLIBDIR='/usr/lib/pbuilder' - PBUILDER_SYSCONFDIR='/etc' - PPID='15679' - PS1='# ' - PS2='> ' + INVOCATION_ID=09200e497f9c47e1952cb1125240d64e + LANG=C + LANGUAGE=it_CH:it + LC_ALL=C + MACHTYPE=arm-unknown-linux-gnueabihf + MAIL=/var/mail/root + OPTERR=1 + OPTIND=1 + OSTYPE=linux-gnueabihf + PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path + PBCURRENTCOMMANDLINEOPERATION=build + PBUILDER_OPERATION=build + PBUILDER_PKGDATADIR=/usr/share/pbuilder + PBUILDER_PKGLIBDIR=/usr/lib/pbuilder + PBUILDER_SYSCONFDIR=/etc + PIPESTATUS=([0]="0") + POSIXLY_CORRECT=y + PPID=3599 PS4='+ ' - PWD='/' - SHELL='/bin/bash' - SHLVL='2' - SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.KsRYi9Ws/pbuilderrc_ObRm --distribution unstable --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/unstable-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.KsRYi9Ws/b1 --logfile b1/build.log servefile_0.5.4-3.1.dsc' - SUDO_GID='113' - SUDO_UID='107' - SUDO_USER='jenkins' - TERM='unknown' - TZ='/usr/share/zoneinfo/Etc/GMT+12' - USER='root' - _='/usr/bin/systemd-run' - http_proxy='http://10.0.0.15:3142/' + PWD=/ + SHELL=/bin/bash + SHELLOPTS=braceexpand:errexit:hashall:interactive-comments:posix + SHLVL=3 + SUDO_COMMAND='/usr/bin/timeout -k 24.1h 24h /usr/bin/ionice -c 3 /usr/bin/nice -n 11 /usr/bin/unshare --uts -- /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.KsRYi9Ws/pbuilderrc_DrMv --distribution unstable --hookdir /etc/pbuilder/rebuild-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/unstable-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.KsRYi9Ws/b2 --logfile b2/build.log servefile_0.5.4-3.1.dsc' + SUDO_GID=113 + SUDO_UID=107 + SUDO_USER=jenkins + TERM=unknown + TZ=/usr/share/zoneinfo/Etc/GMT-14 + UID=0 + USER=root + _='I: set' + http_proxy=http://10.0.0.15:3142/ I: uname -a - Linux virt64c 6.1.0-26-arm64 #1 SMP Debian 6.1.112-1 (2024-09-30) aarch64 GNU/Linux + Linux i-capture-the-hostname 6.1.0-26-armmp-lpae #1 SMP Debian 6.1.112-1 (2024-09-30) armv7l GNU/Linux I: ls -l /bin lrwxrwxrwx 1 root root 7 Aug 4 21:30 /bin -> usr/bin -I: user script /srv/workspace/pbuilder/15679/tmp/hooks/D02_print_environment finished +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/D02_print_environment finished -> Attempting to satisfy build-dependencies -> Creating 
pbuilder-satisfydepends-dummy package Package: pbuilder-satisfydepends-dummy @@ -209,7 +241,7 @@ Get: 83 http://deb.debian.org/debian unstable/main armhf python3-pytest all 8.3.3-1 [249 kB] Get: 84 http://deb.debian.org/debian unstable/main armhf python3-urllib3 all 2.0.7-2 [111 kB] Get: 85 http://deb.debian.org/debian unstable/main armhf python3-requests all 2.32.3+dfsg-1 [71.9 kB] -Fetched 31.5 MB in 1s (23.1 MB/s) +Fetched 31.5 MB in 5s (6719 kB/s) debconf: delaying package configuration, since apt-utils is not installed Selecting previously unselected package libpython3.12-minimal:armhf. (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 19690 files and directories currently installed.) @@ -498,8 +530,8 @@ Setting up tzdata (2024b-3) ... Current default time zone: 'Etc/UTC' -Local time is now: Sat Nov 9 09:44:16 UTC 2024. -Universal Time is now: Sat Nov 9 09:44:16 UTC 2024. +Local time is now: Sat Nov 9 09:49:02 UTC 2024. +Universal Time is now: Sat Nov 9 09:49:02 UTC 2024. Run 'dpkg-reconfigure tzdata' if you wish to change it. Setting up libcap2-bin (1:2.66-5+b1) ... @@ -585,7 +617,11 @@ Building tag database... -> Finished parsing the build-deps I: Building the package -I: Running cd /build/reproducible-path/servefile-0.5.4/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../servefile_0.5.4-3.1_source.changes +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/A99_set_merged_usr starting +Not re-configuring usrmerge for unstable +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/A99_set_merged_usr finished +hostname: Name or service not known +I: Running cd /build/reproducible-path/servefile-0.5.4/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-genchanges -S > ../servefile_0.5.4-3.1_source.changes dpkg-buildpackage: info: source package servefile dpkg-buildpackage: info: source version 0.5.4-3.1 dpkg-buildpackage: info: source distribution unstable @@ -620,9 +656,9 @@ running build running build_py creating /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile -copying servefile/__main__.py -> /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile -copying servefile/__init__.py -> /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile copying servefile/servefile.py -> /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile +copying servefile/__init__.py -> /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile +copying servefile/__main__.py -> 
/build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile debian/rules override_dh_auto_test make[1]: Entering directory '/build/reproducible-path/servefile-0.5.4' http_proxy= https_proxy= dh_auto_test -- --test-pytest @@ -633,91 +669,1394 @@ plugins: typeguard-4.4.1 collected 19 items -tests/test_servefile.py ................... [100%] +tests/test_servefile.py ................FF. [100%] -============================= 19 passed in 11.27s ============================== -make[1]: Leaving directory '/build/reproducible-path/servefile-0.5.4' - create-stamp debian/debhelper-build-stamp - dh_testroot -O--buildsystem=pybuild - dh_prep -O--buildsystem=pybuild - dh_auto_install --destdir=debian/servefile/ -O--buildsystem=pybuild -I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/servefile-0.5.4/debian/servefile -/usr/lib/python3/dist-packages/setuptools/_distutils/dist.py:261: UserWarning: Unknown distribution option: 'tests_require' - warnings.warn(msg) -running install -/usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated. -!! - - ******************************************************************************** - Please avoid running ``setup.py`` directly. - Instead, use pypa/build, pypa/installer or other - standards-based tools. +=================================== FAILURES =================================== +__________________________________ test_https __________________________________ - See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details. - ******************************************************************************** +self = -!! - self.initialize_options() -running build -running build_py -running install_lib -creating /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages -creating /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile -creating /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile/__pycache__ -copying /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile/__pycache__/__main__.cpython-312.pyc -> /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile/__pycache__ -copying /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile/__pycache__/__init__.cpython-312.pyc -> /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile/__pycache__ -copying /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile/__pycache__/servefile.cpython-312.pyc -> /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile/__pycache__ -copying /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile/__main__.py -> /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile -copying /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile/__init__.py -> /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile -copying /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build/servefile/servefile.py -> /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile 
-byte-compiling /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile/__main__.py to __main__.cpython-312.pyc -byte-compiling /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile/__init__.py to __init__.cpython-312.pyc -byte-compiling /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile/servefile.py to servefile.cpython-312.pyc -running install_egg_info -running egg_info -creating servefile.egg-info -writing servefile.egg-info/PKG-INFO -writing dependency_links to servefile.egg-info/dependency_links.txt -writing entry points to servefile.egg-info/entry_points.txt -writing requirements to servefile.egg-info/requires.txt -writing top-level names to servefile.egg-info/top_level.txt -writing manifest file 'servefile.egg-info/SOURCES.txt' -reading manifest file 'servefile.egg-info/SOURCES.txt' -reading manifest template 'MANIFEST.in' -writing manifest file 'servefile.egg-info/SOURCES.txt' -Copying servefile.egg-info to /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/lib/python3.12/dist-packages/servefile-0.5.4.egg-info -Skipping SOURCES.txt -running install_scripts -Installing servefile script to /build/reproducible-path/servefile-0.5.4/debian/servefile/usr/bin - dh_installdocs -O--buildsystem=pybuild - dh_installchangelogs -O--buildsystem=pybuild - dh_installman -O--buildsystem=pybuild - dh_python3 -O--buildsystem=pybuild - dh_installsystemduser -O--buildsystem=pybuild - dh_perl -O--buildsystem=pybuild - dh_link -O--buildsystem=pybuild - dh_strip_nondeterminism -O--buildsystem=pybuild - dh_compress -O--buildsystem=pybuild - dh_fixperms -O--buildsystem=pybuild - dh_missing -O--buildsystem=pybuild - dh_installdeb -O--buildsystem=pybuild - dh_gencontrol -O--buildsystem=pybuild - dh_md5sums -O--buildsystem=pybuild - dh_builddeb -O--buildsystem=pybuild -dpkg-deb: building package 'servefile' in '../servefile_0.5.4-3.1_all.deb'. - dpkg-genbuildinfo --build=binary -O../servefile_0.5.4-3.1_armhf.buildinfo - dpkg-genchanges --build=binary -O../servefile_0.5.4-3.1_armhf.changes -dpkg-genchanges: info: binary-only upload (no source code included) - dpkg-source --after-build . -dpkg-buildpackage: info: binary-only upload (no source included) -dpkg-genchanges: info: not including original source code in upload + def _new_conn(self) -> socket.socket: + """Establish a socket connection and set nodelay settings on it. + + :return: New socket connection. + """ + try: +> sock = connection.create_connection( + (self._dns_host, self.port), + self.timeout, + source_address=self.source_address, + socket_options=self.socket_options, + ) + +/usr/lib/python3/dist-packages/urllib3/connection.py:203: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +/usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection + raise err +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +address = ('localhost', 58835), timeout = None, source_address = None +socket_options = [(6, 1, 1)] + + def create_connection( + address: tuple[str, int], + timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + source_address: tuple[str, int] | None = None, + socket_options: _TYPE_SOCKET_OPTIONS | None = None, + ) -> socket.socket: + """Connect to *address* and return the socket object. + + Convenience function. Connect to *address* (a 2-tuple ``(host, + port)``) and return the socket object. 
Passing the optional + *timeout* parameter will set the timeout on the socket instance + before attempting to connect. If no *timeout* is supplied, the + global default timeout setting returned by :func:`socket.getdefaulttimeout` + is used. If *source_address* is set it must be a tuple of (host, port) + for the socket to bind as a source address before making the connection. + An host of '' or port 0 tells the OS to use the default. + """ + + host, port = address + if host.startswith("["): + host = host.strip("[]") + err = None + + # Using the value from allowed_gai_family() in the context of getaddrinfo lets + # us select whether to work with IPv4 DNS records, IPv6 records, or both. + # The original create_connection function always returns all records. + family = allowed_gai_family() + + try: + host.encode("idna") + except UnicodeError: + raise LocationParseError(f"'{host}', label empty or too long") from None + + for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): + af, socktype, proto, canonname, sa = res + sock = None + try: + sock = socket.socket(af, socktype, proto) + + # If provided, set socket level options before connecting. + _set_socket_options(sock, socket_options) + + if timeout is not _DEFAULT_TIMEOUT: + sock.settimeout(timeout) + if source_address: + sock.bind(source_address) +> sock.connect(sa) +E ConnectionRefusedError: [Errno 111] Connection refused + +/usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError + +The above exception was the direct cause of the following exception: + +self = +method = 'GET', url = '/', body = None +headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} +retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) +redirect = False, assert_same_host = False +timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None +release_conn = False, chunked = False, body_pos = None, preload_content = False +decode_content = False, response_kw = {} +parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) +destination_scheme = None, conn = None, release_this_conn = True +http_tunnel_required = False, err = None, clean_exit = False + + def urlopen( # type: ignore[override] + self, + method: str, + url: str, + body: _TYPE_BODY | None = None, + headers: typing.Mapping[str, str] | None = None, + retries: Retry | bool | int | None = None, + redirect: bool = True, + assert_same_host: bool = True, + timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + pool_timeout: int | None = None, + release_conn: bool | None = None, + chunked: bool = False, + body_pos: _TYPE_BODY_POSITION | None = None, + preload_content: bool = True, + decode_content: bool = True, + **response_kw: typing.Any, + ) -> BaseHTTPResponse: + """ + Get a connection from the pool and perform an HTTP request. This is the + lowest level call for making a request, so you'll need to specify all + the raw details. + + .. note:: + + More commonly, it's appropriate to use a convenience method + such as :meth:`request`. + + .. note:: + + `release_conn` will only behave as expected if + `preload_content=False` because we want to make + `preload_content=False` the default behaviour someday soon without + breaking backwards compatibility. + + :param method: + HTTP request method (such as GET, POST, PUT, etc.) + + :param url: + The URL to perform the request on. 
+ + :param body: + Data to send in the request body, either :class:`str`, :class:`bytes`, + an iterable of :class:`str`/:class:`bytes`, or a file-like object. + + :param headers: + Dictionary of custom headers to send, such as User-Agent, + If-None-Match, etc. If None, pool headers are used. If provided, + these headers completely replace any pool-specific headers. + + :param retries: + Configure the number of retries to allow before raising a + :class:`~urllib3.exceptions.MaxRetryError` exception. + + Pass ``None`` to retry until you receive a response. Pass a + :class:`~urllib3.util.retry.Retry` object for fine-grained control + over different types of retries. + Pass an integer number to retry connection errors that many times, + but no other types of errors. Pass zero to never retry. + + If ``False``, then retries are disabled and any exception is raised + immediately. Also, instead of raising a MaxRetryError on redirects, + the redirect response will be returned. + + :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. + + :param redirect: + If True, automatically handle redirects (status codes 301, 302, + 303, 307, 308). Each redirect counts as a retry. Disabling retries + will disable redirect, too. + + :param assert_same_host: + If ``True``, will make sure that the host of the pool requests is + consistent else will raise HostChangedError. When ``False``, you can + use the pool on an HTTP proxy and request foreign hosts. + + :param timeout: + If specified, overrides the default timeout for this one + request. It may be a float (in seconds) or an instance of + :class:`urllib3.util.Timeout`. + + :param pool_timeout: + If set and the pool is set to block=True, then this method will + block for ``pool_timeout`` seconds and raise EmptyPoolError if no + connection is available within the time period. + + :param bool preload_content: + If True, the response's body will be preloaded into memory. + + :param bool decode_content: + If True, will attempt to decode the body based on the + 'content-encoding' header. + + :param release_conn: + If False, then the urlopen call will not release the connection + back into the pool once a response is received (but will release if + you read the entire contents of the response such as when + `preload_content=True`). This is useful if you're not preloading + the response's content immediately. You will need to call + ``r.release_conn()`` on the response ``r`` to return the connection + back into the pool. If None, it takes the value of ``preload_content`` + which defaults to ``True``. + + :param bool chunked: + If True, urllib3 will send the body using chunked transfer + encoding. Otherwise, urllib3 will send the body using the standard + content-length form. Defaults to False. + + :param int body_pos: + Position to seek to in file-like body in the event of a retry or + redirect. Typically this won't need to be set because urllib3 will + auto-populate the value when needed. 
+ """ + parsed_url = parse_url(url) + destination_scheme = parsed_url.scheme + + if headers is None: + headers = self.headers + + if not isinstance(retries, Retry): + retries = Retry.from_int(retries, redirect=redirect, default=self.retries) + + if release_conn is None: + release_conn = preload_content + + # Check host + if assert_same_host and not self.is_same_host(url): + raise HostChangedError(self, url, retries) + + # Ensure that the URL we're connecting to is properly encoded + if url.startswith("/"): + url = to_str(_encode_target(url)) + else: + url = to_str(parsed_url.url) + + conn = None + + # Track whether `conn` needs to be released before + # returning/raising/recursing. Update this variable if necessary, and + # leave `release_conn` constant throughout the function. That way, if + # the function recurses, the original value of `release_conn` will be + # passed down into the recursive call, and its value will be respected. + # + # See issue #651 [1] for details. + # + # [1] + release_this_conn = release_conn + + http_tunnel_required = connection_requires_http_tunnel( + self.proxy, self.proxy_config, destination_scheme + ) + + # Merge the proxy headers. Only done when not using HTTP CONNECT. We + # have to copy the headers dict so we can safely change it without those + # changes being reflected in anyone else's copy. + if not http_tunnel_required: + headers = headers.copy() # type: ignore[attr-defined] + headers.update(self.proxy_headers) # type: ignore[union-attr] + + # Must keep the exception bound to a separate variable or else Python 3 + # complains about UnboundLocalError. + err = None + + # Keep track of whether we cleanly exited the except block. This + # ensures we do proper cleanup in finally. + clean_exit = False + + # Rewind body position, if needed. Record current position + # for future rewinds in the event of a redirect/retry. + body_pos = set_file_position(body, body_pos) + + try: + # Request a connection from the queue. + timeout_obj = self._get_timeout(timeout) + conn = self._get_conn(timeout=pool_timeout) + + conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] + + # Is this a closed/new connection that requires CONNECT tunnelling? + if self.proxy is not None and http_tunnel_required and conn.is_closed: + try: + self._prepare_proxy(conn) + except (BaseSSLError, OSError, SocketTimeout) as e: + self._raise_timeout( + err=e, url=self.proxy.url, timeout_value=conn.timeout + ) + raise + + # If we're going to release the connection in ``finally:``, then + # the response doesn't need to know about the connection. Otherwise + # it will also try to release it and we'll have a double-release + # mess. 
+ response_conn = conn if not release_conn else None + + # Make the request on the HTTPConnection object +> response = self._make_request( + conn, + method, + url, + timeout=timeout_obj, + body=body, + headers=headers, + chunked=chunked, + retries=retries, + response_conn=response_conn, + preload_content=preload_content, + decode_content=decode_content, + **response_kw, + ) + +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:492: in _make_request + raise new_e +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:468: in _make_request + self._validate_conn(conn) +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:1097: in _validate_conn + conn.connect() +/usr/lib/python3/dist-packages/urllib3/connection.py:611: in connect + self.sock = sock = self._new_conn() +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +self = + + def _new_conn(self) -> socket.socket: + """Establish a socket connection and set nodelay settings on it. + + :return: New socket connection. + """ + try: + sock = connection.create_connection( + (self._dns_host, self.port), + self.timeout, + source_address=self.source_address, + socket_options=self.socket_options, + ) + except socket.gaierror as e: + raise NameResolutionError(self.host, self, e) from e + except SocketTimeout as e: + raise ConnectTimeoutError( + self, + f"Connection to {self.host} timed out. (connect timeout={self.timeout})", + ) from e + + except OSError as e: +> raise NewConnectionError( + self, f"Failed to establish a new connection: {e}" + ) from e +E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused + +/usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError + +The above exception was the direct cause of the following exception: + +self = +request = , stream = False +timeout = Timeout(connect=None, read=None, total=None), verify = False +cert = None, proxies = OrderedDict() + + def send( + self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None + ): + """Sends PreparedRequest object. Returns Response object. + + :param request: The :class:`PreparedRequest ` being sent. + :param stream: (optional) Whether to stream the request content. + :param timeout: (optional) How long to wait for the server to send + data before giving up, as a float, or a :ref:`(connect timeout, + read timeout) ` tuple. + :type timeout: float or tuple or urllib3 Timeout object + :param verify: (optional) Either a boolean, in which case it controls whether + we verify the server's TLS certificate, or a string, in which case it + must be a path to a CA bundle to use + :param cert: (optional) Any user-provided SSL certificate to be trusted. + :param proxies: (optional) The proxies dictionary to apply to the request. 
+ :rtype: requests.Response + """ + + try: + conn = self.get_connection_with_tls_context( + request, verify, proxies=proxies, cert=cert + ) + except LocationValueError as e: + raise InvalidURL(e, request=request) + + self.cert_verify(conn, request.url, verify, cert) + url = self.request_url(request, proxies) + self.add_headers( + request, + stream=stream, + timeout=timeout, + verify=verify, + cert=cert, + proxies=proxies, + ) + + chunked = not (request.body is None or "Content-Length" in request.headers) + + if isinstance(timeout, tuple): + try: + connect, read = timeout + timeout = TimeoutSauce(connect=connect, read=read) + except ValueError: + raise ValueError( + f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " + f"or a single float to set both timeouts to the same value." + ) + elif isinstance(timeout, TimeoutSauce): + pass + else: + timeout = TimeoutSauce(connect=timeout, read=timeout) + + try: +> resp = conn.urlopen( + method=request.method, + url=url, + body=request.body, + headers=request.headers, + redirect=False, + assert_same_host=False, + preload_content=False, + decode_content=False, + retries=self.max_retries, + timeout=timeout, + chunked=chunked, + ) + +/usr/lib/python3/dist-packages/requests/adapters.py:667: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen + retries = retries.increment( +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +self = Retry(total=0, connect=None, read=False, redirect=None, status=None) +method = 'GET', url = '/', response = None +error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') +_pool = +_stacktrace = + + def increment( + self, + method: str | None = None, + url: str | None = None, + response: BaseHTTPResponse | None = None, + error: Exception | None = None, + _pool: ConnectionPool | None = None, + _stacktrace: TracebackType | None = None, + ) -> Retry: + """Return a new Retry object with incremented retry counters. + + :param response: A response object, or None, if the server did not + return a response. + :type response: :class:`~urllib3.response.BaseHTTPResponse` + :param Exception error: An error encountered during the request, or + None if the response was received successfully. + + :return: A new ``Retry`` object. + """ + if self.total is False and error: + # Disabled, indicate to re-raise the error. + raise reraise(type(error), error, _stacktrace) + + total = self.total + if total is not None: + total -= 1 + + connect = self.connect + read = self.read + redirect = self.redirect + status_count = self.status + other = self.other + cause = "unknown" + status = None + redirect_location = None + + if error and self._is_connection_error(error): + # Connect retry? + if connect is False: + raise reraise(type(error), error, _stacktrace) + elif connect is not None: + connect -= 1 + + elif error and self._is_read_error(error): + # Read retry? + if read is False or method is None or not self._is_method_retryable(method): + raise reraise(type(error), error, _stacktrace) + elif read is not None: + read -= 1 + + elif error: + # Other retry? + if other is not None: + other -= 1 + + elif response and response.get_redirect_location(): + # Redirect retry? 
+ if redirect is not None: + redirect -= 1 + cause = "too many redirects" + response_redirect_location = response.get_redirect_location() + if response_redirect_location: + redirect_location = response_redirect_location + status = response.status + + else: + # Incrementing because of a server error like a 500 in + # status_forcelist and the given method is in the allowed_methods + cause = ResponseError.GENERIC_ERROR + if response and response.status: + if status_count is not None: + status_count -= 1 + cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) + status = response.status + + history = self.history + ( + RequestHistory(method, url, error, status, redirect_location), + ) + + new_retry = self.new( + total=total, + connect=connect, + read=read, + redirect=redirect, + status=status_count, + other=other, + history=history, + ) + + if new_retry.is_exhausted(): + reason = error or ResponseError(cause) +> raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] +E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='localhost', port=58835): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) + +/usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError + +During handling of the above exception, another exception occurred: + +run_servefile = ._run_servefile at 0xb5c54118> +datadir = ._datadir at 0xb5c546b8> + + def test_https(run_servefile, datadir): + data = "NOOT NOOT" + p = datadir({'testfile': data}) / 'testfile' + run_servefile(['--ssl', str(p)]) + + # fingerprint = None + # while not fingerprint: + # line = s.stdout.readline() + # print(line) + # # if we find this line we went too far... + # assert not line.startswith("Some addresses this file will be available at") + + # if line.startswith("SHA1 fingerprint"): + # fingerprint = line.replace("SHA1 fingerprint: ", "").strip() + # break + + # assert fingerprint + urllib3.disable_warnings() +> _retry_while(ConnectionError, check_download)(data, protocol='https', verify=False) + +tests/test_servefile.py:427: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +tests/test_servefile.py:127: in wrapped + return function(*args, **kwargs) +tests/test_servefile.py:109: in check_download + r = make_request(path, **kwargs) +tests/test_servefile.py:98: in make_request + r = getattr(requests, method)(url, **kwargs) +/usr/lib/python3/dist-packages/requests/api.py:73: in get + return request("get", url, params=params, **kwargs) +/usr/lib/python3/dist-packages/requests/api.py:59: in request + return session.request(method=method, url=url, **kwargs) +/usr/lib/python3/dist-packages/requests/sessions.py:589: in request + resp = self.send(prep, **send_kwargs) +/usr/lib/python3/dist-packages/requests/sessions.py:703: in send + r = adapter.send(request, **kwargs) +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +self = +request = , stream = False +timeout = Timeout(connect=None, read=None, total=None), verify = False +cert = None, proxies = OrderedDict() + + def send( + self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None + ): + """Sends PreparedRequest object. Returns Response object. + + :param request: The :class:`PreparedRequest ` being sent. + :param stream: (optional) Whether to stream the request content. 
+ :param timeout: (optional) How long to wait for the server to send + data before giving up, as a float, or a :ref:`(connect timeout, + read timeout) ` tuple. + :type timeout: float or tuple or urllib3 Timeout object + :param verify: (optional) Either a boolean, in which case it controls whether + we verify the server's TLS certificate, or a string, in which case it + must be a path to a CA bundle to use + :param cert: (optional) Any user-provided SSL certificate to be trusted. + :param proxies: (optional) The proxies dictionary to apply to the request. + :rtype: requests.Response + """ + + try: + conn = self.get_connection_with_tls_context( + request, verify, proxies=proxies, cert=cert + ) + except LocationValueError as e: + raise InvalidURL(e, request=request) + + self.cert_verify(conn, request.url, verify, cert) + url = self.request_url(request, proxies) + self.add_headers( + request, + stream=stream, + timeout=timeout, + verify=verify, + cert=cert, + proxies=proxies, + ) + + chunked = not (request.body is None or "Content-Length" in request.headers) + + if isinstance(timeout, tuple): + try: + connect, read = timeout + timeout = TimeoutSauce(connect=connect, read=read) + except ValueError: + raise ValueError( + f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " + f"or a single float to set both timeouts to the same value." + ) + elif isinstance(timeout, TimeoutSauce): + pass + else: + timeout = TimeoutSauce(connect=timeout, read=timeout) + + try: + resp = conn.urlopen( + method=request.method, + url=url, + body=request.body, + headers=request.headers, + redirect=False, + assert_same_host=False, + preload_content=False, + decode_content=False, + retries=self.max_retries, + timeout=timeout, + chunked=chunked, + ) + + except (ProtocolError, OSError) as err: + raise ConnectionError(err, request=request) + + except MaxRetryError as e: + if isinstance(e.reason, ConnectTimeoutError): + # TODO: Remove this in 3.0.0: see #2811 + if not isinstance(e.reason, NewConnectionError): + raise ConnectTimeout(e, request=request) + + if isinstance(e.reason, ResponseError): + raise RetryError(e, request=request) + + if isinstance(e.reason, _ProxyError): + raise ProxyError(e, request=request) + + if isinstance(e.reason, _SSLError): + # This branch is for urllib3 v1.22 and later. 
+ raise SSLError(e, request=request) + +> raise ConnectionError(e, request=request) +E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='localhost', port=58835): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) + +/usr/lib/python3/dist-packages/requests/adapters.py:700: ConnectionError +----------------------------- Captured stdout call ----------------------------- +running -m, servefile with args ['--ssl', '/tmp/pytest-of-pbuilder2/pytest-0/test_https0/testfile', '-p', '58835'] +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Generating SSL certificate...Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +___________________________ test_https_big_download ____________________________ + +self = + + def _new_conn(self) -> socket.socket: + """Establish a socket connection and set nodelay settings on it. + + :return: New socket connection. + """ + try: +> sock = connection.create_connection( + (self._dns_host, self.port), + self.timeout, + source_address=self.source_address, + socket_options=self.socket_options, + ) + +/usr/lib/python3/dist-packages/urllib3/connection.py:203: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +/usr/lib/python3/dist-packages/urllib3/util/connection.py:85: in create_connection + raise err +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +address = ('localhost', 58835), timeout = None, source_address = None +socket_options = [(6, 1, 1)] + + def create_connection( + address: tuple[str, int], + timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + source_address: tuple[str, int] | None = None, + socket_options: _TYPE_SOCKET_OPTIONS | None = None, + ) -> socket.socket: + """Connect to *address* and return the socket object. + + Convenience function. Connect to *address* (a 2-tuple ``(host, + port)``) and return the socket object. Passing the optional + *timeout* parameter will set the timeout on the socket instance + before attempting to connect. If no *timeout* is supplied, the + global default timeout setting returned by :func:`socket.getdefaulttimeout` + is used. If *source_address* is set it must be a tuple of (host, port) + for the socket to bind as a source address before making the connection. 
+ An host of '' or port 0 tells the OS to use the default. + """ + + host, port = address + if host.startswith("["): + host = host.strip("[]") + err = None + + # Using the value from allowed_gai_family() in the context of getaddrinfo lets + # us select whether to work with IPv4 DNS records, IPv6 records, or both. + # The original create_connection function always returns all records. + family = allowed_gai_family() + + try: + host.encode("idna") + except UnicodeError: + raise LocationParseError(f"'{host}', label empty or too long") from None + + for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): + af, socktype, proto, canonname, sa = res + sock = None + try: + sock = socket.socket(af, socktype, proto) + + # If provided, set socket level options before connecting. + _set_socket_options(sock, socket_options) + + if timeout is not _DEFAULT_TIMEOUT: + sock.settimeout(timeout) + if source_address: + sock.bind(source_address) +> sock.connect(sa) +E ConnectionRefusedError: [Errno 111] Connection refused + +/usr/lib/python3/dist-packages/urllib3/util/connection.py:73: ConnectionRefusedError + +The above exception was the direct cause of the following exception: + +self = +method = 'GET', url = '/', body = None +headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive'} +retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) +redirect = False, assert_same_host = False +timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None +release_conn = False, chunked = False, body_pos = None, preload_content = False +decode_content = False, response_kw = {} +parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/', query=None, fragment=None) +destination_scheme = None, conn = None, release_this_conn = True +http_tunnel_required = False, err = None, clean_exit = False + + def urlopen( # type: ignore[override] + self, + method: str, + url: str, + body: _TYPE_BODY | None = None, + headers: typing.Mapping[str, str] | None = None, + retries: Retry | bool | int | None = None, + redirect: bool = True, + assert_same_host: bool = True, + timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, + pool_timeout: int | None = None, + release_conn: bool | None = None, + chunked: bool = False, + body_pos: _TYPE_BODY_POSITION | None = None, + preload_content: bool = True, + decode_content: bool = True, + **response_kw: typing.Any, + ) -> BaseHTTPResponse: + """ + Get a connection from the pool and perform an HTTP request. This is the + lowest level call for making a request, so you'll need to specify all + the raw details. + + .. note:: + + More commonly, it's appropriate to use a convenience method + such as :meth:`request`. + + .. note:: + + `release_conn` will only behave as expected if + `preload_content=False` because we want to make + `preload_content=False` the default behaviour someday soon without + breaking backwards compatibility. + + :param method: + HTTP request method (such as GET, POST, PUT, etc.) + + :param url: + The URL to perform the request on. + + :param body: + Data to send in the request body, either :class:`str`, :class:`bytes`, + an iterable of :class:`str`/:class:`bytes`, or a file-like object. + + :param headers: + Dictionary of custom headers to send, such as User-Agent, + If-None-Match, etc. If None, pool headers are used. If provided, + these headers completely replace any pool-specific headers. 
+ + :param retries: + Configure the number of retries to allow before raising a + :class:`~urllib3.exceptions.MaxRetryError` exception. + + Pass ``None`` to retry until you receive a response. Pass a + :class:`~urllib3.util.retry.Retry` object for fine-grained control + over different types of retries. + Pass an integer number to retry connection errors that many times, + but no other types of errors. Pass zero to never retry. + + If ``False``, then retries are disabled and any exception is raised + immediately. Also, instead of raising a MaxRetryError on redirects, + the redirect response will be returned. + + :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. + + :param redirect: + If True, automatically handle redirects (status codes 301, 302, + 303, 307, 308). Each redirect counts as a retry. Disabling retries + will disable redirect, too. + + :param assert_same_host: + If ``True``, will make sure that the host of the pool requests is + consistent else will raise HostChangedError. When ``False``, you can + use the pool on an HTTP proxy and request foreign hosts. + + :param timeout: + If specified, overrides the default timeout for this one + request. It may be a float (in seconds) or an instance of + :class:`urllib3.util.Timeout`. + + :param pool_timeout: + If set and the pool is set to block=True, then this method will + block for ``pool_timeout`` seconds and raise EmptyPoolError if no + connection is available within the time period. + + :param bool preload_content: + If True, the response's body will be preloaded into memory. + + :param bool decode_content: + If True, will attempt to decode the body based on the + 'content-encoding' header. + + :param release_conn: + If False, then the urlopen call will not release the connection + back into the pool once a response is received (but will release if + you read the entire contents of the response such as when + `preload_content=True`). This is useful if you're not preloading + the response's content immediately. You will need to call + ``r.release_conn()`` on the response ``r`` to return the connection + back into the pool. If None, it takes the value of ``preload_content`` + which defaults to ``True``. + + :param bool chunked: + If True, urllib3 will send the body using chunked transfer + encoding. Otherwise, urllib3 will send the body using the standard + content-length form. Defaults to False. + + :param int body_pos: + Position to seek to in file-like body in the event of a retry or + redirect. Typically this won't need to be set because urllib3 will + auto-populate the value when needed. + """ + parsed_url = parse_url(url) + destination_scheme = parsed_url.scheme + + if headers is None: + headers = self.headers + + if not isinstance(retries, Retry): + retries = Retry.from_int(retries, redirect=redirect, default=self.retries) + + if release_conn is None: + release_conn = preload_content + + # Check host + if assert_same_host and not self.is_same_host(url): + raise HostChangedError(self, url, retries) + + # Ensure that the URL we're connecting to is properly encoded + if url.startswith("/"): + url = to_str(_encode_target(url)) + else: + url = to_str(parsed_url.url) + + conn = None + + # Track whether `conn` needs to be released before + # returning/raising/recursing. Update this variable if necessary, and + # leave `release_conn` constant throughout the function. 
That way, if + # the function recurses, the original value of `release_conn` will be + # passed down into the recursive call, and its value will be respected. + # + # See issue #651 [1] for details. + # + # [1] + release_this_conn = release_conn + + http_tunnel_required = connection_requires_http_tunnel( + self.proxy, self.proxy_config, destination_scheme + ) + + # Merge the proxy headers. Only done when not using HTTP CONNECT. We + # have to copy the headers dict so we can safely change it without those + # changes being reflected in anyone else's copy. + if not http_tunnel_required: + headers = headers.copy() # type: ignore[attr-defined] + headers.update(self.proxy_headers) # type: ignore[union-attr] + + # Must keep the exception bound to a separate variable or else Python 3 + # complains about UnboundLocalError. + err = None + + # Keep track of whether we cleanly exited the except block. This + # ensures we do proper cleanup in finally. + clean_exit = False + + # Rewind body position, if needed. Record current position + # for future rewinds in the event of a redirect/retry. + body_pos = set_file_position(body, body_pos) + + try: + # Request a connection from the queue. + timeout_obj = self._get_timeout(timeout) + conn = self._get_conn(timeout=pool_timeout) + + conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] + + # Is this a closed/new connection that requires CONNECT tunnelling? + if self.proxy is not None and http_tunnel_required and conn.is_closed: + try: + self._prepare_proxy(conn) + except (BaseSSLError, OSError, SocketTimeout) as e: + self._raise_timeout( + err=e, url=self.proxy.url, timeout_value=conn.timeout + ) + raise + + # If we're going to release the connection in ``finally:``, then + # the response doesn't need to know about the connection. Otherwise + # it will also try to release it and we'll have a double-release + # mess. + response_conn = conn if not release_conn else None + + # Make the request on the HTTPConnection object +> response = self._make_request( + conn, + method, + url, + timeout=timeout_obj, + body=body, + headers=headers, + chunked=chunked, + retries=retries, + response_conn=response_conn, + preload_content=preload_content, + decode_content=decode_content, + **response_kw, + ) + +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:791: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:492: in _make_request + raise new_e +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:468: in _make_request + self._validate_conn(conn) +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:1097: in _validate_conn + conn.connect() +/usr/lib/python3/dist-packages/urllib3/connection.py:611: in connect + self.sock = sock = self._new_conn() +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +self = + + def _new_conn(self) -> socket.socket: + """Establish a socket connection and set nodelay settings on it. + + :return: New socket connection. + """ + try: + sock = connection.create_connection( + (self._dns_host, self.port), + self.timeout, + source_address=self.source_address, + socket_options=self.socket_options, + ) + except socket.gaierror as e: + raise NameResolutionError(self.host, self, e) from e + except SocketTimeout as e: + raise ConnectTimeoutError( + self, + f"Connection to {self.host} timed out. 
(connect timeout={self.timeout})", + ) from e + + except OSError as e: +> raise NewConnectionError( + self, f"Failed to establish a new connection: {e}" + ) from e +E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused + +/usr/lib/python3/dist-packages/urllib3/connection.py:218: NewConnectionError + +The above exception was the direct cause of the following exception: + +self = +request = , stream = False +timeout = Timeout(connect=None, read=None, total=None), verify = False +cert = None, proxies = OrderedDict() + + def send( + self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None + ): + """Sends PreparedRequest object. Returns Response object. + + :param request: The :class:`PreparedRequest ` being sent. + :param stream: (optional) Whether to stream the request content. + :param timeout: (optional) How long to wait for the server to send + data before giving up, as a float, or a :ref:`(connect timeout, + read timeout) ` tuple. + :type timeout: float or tuple or urllib3 Timeout object + :param verify: (optional) Either a boolean, in which case it controls whether + we verify the server's TLS certificate, or a string, in which case it + must be a path to a CA bundle to use + :param cert: (optional) Any user-provided SSL certificate to be trusted. + :param proxies: (optional) The proxies dictionary to apply to the request. + :rtype: requests.Response + """ + + try: + conn = self.get_connection_with_tls_context( + request, verify, proxies=proxies, cert=cert + ) + except LocationValueError as e: + raise InvalidURL(e, request=request) + + self.cert_verify(conn, request.url, verify, cert) + url = self.request_url(request, proxies) + self.add_headers( + request, + stream=stream, + timeout=timeout, + verify=verify, + cert=cert, + proxies=proxies, + ) + + chunked = not (request.body is None or "Content-Length" in request.headers) + + if isinstance(timeout, tuple): + try: + connect, read = timeout + timeout = TimeoutSauce(connect=connect, read=read) + except ValueError: + raise ValueError( + f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " + f"or a single float to set both timeouts to the same value." + ) + elif isinstance(timeout, TimeoutSauce): + pass + else: + timeout = TimeoutSauce(connect=timeout, read=timeout) + + try: +> resp = conn.urlopen( + method=request.method, + url=url, + body=request.body, + headers=request.headers, + redirect=False, + assert_same_host=False, + preload_content=False, + decode_content=False, + retries=self.max_retries, + timeout=timeout, + chunked=chunked, + ) + +/usr/lib/python3/dist-packages/requests/adapters.py:667: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +/usr/lib/python3/dist-packages/urllib3/connectionpool.py:845: in urlopen + retries = retries.increment( +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +self = Retry(total=0, connect=None, read=False, redirect=None, status=None) +method = 'GET', url = '/', response = None +error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') +_pool = +_stacktrace = + + def increment( + self, + method: str | None = None, + url: str | None = None, + response: BaseHTTPResponse | None = None, + error: Exception | None = None, + _pool: ConnectionPool | None = None, + _stacktrace: TracebackType | None = None, + ) -> Retry: + """Return a new Retry object with incremented retry counters. 
+ + :param response: A response object, or None, if the server did not + return a response. + :type response: :class:`~urllib3.response.BaseHTTPResponse` + :param Exception error: An error encountered during the request, or + None if the response was received successfully. + + :return: A new ``Retry`` object. + """ + if self.total is False and error: + # Disabled, indicate to re-raise the error. + raise reraise(type(error), error, _stacktrace) + + total = self.total + if total is not None: + total -= 1 + + connect = self.connect + read = self.read + redirect = self.redirect + status_count = self.status + other = self.other + cause = "unknown" + status = None + redirect_location = None + + if error and self._is_connection_error(error): + # Connect retry? + if connect is False: + raise reraise(type(error), error, _stacktrace) + elif connect is not None: + connect -= 1 + + elif error and self._is_read_error(error): + # Read retry? + if read is False or method is None or not self._is_method_retryable(method): + raise reraise(type(error), error, _stacktrace) + elif read is not None: + read -= 1 + + elif error: + # Other retry? + if other is not None: + other -= 1 + + elif response and response.get_redirect_location(): + # Redirect retry? + if redirect is not None: + redirect -= 1 + cause = "too many redirects" + response_redirect_location = response.get_redirect_location() + if response_redirect_location: + redirect_location = response_redirect_location + status = response.status + + else: + # Incrementing because of a server error like a 500 in + # status_forcelist and the given method is in the allowed_methods + cause = ResponseError.GENERIC_ERROR + if response and response.status: + if status_count is not None: + status_count -= 1 + cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) + status = response.status + + history = self.history + ( + RequestHistory(method, url, error, status, redirect_location), + ) + + new_retry = self.new( + total=total, + connect=connect, + read=read, + redirect=redirect, + status=status_count, + other=other, + history=history, + ) + + if new_retry.is_exhausted(): + reason = error or ResponseError(cause) +> raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] +E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='localhost', port=58835): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) + +/usr/lib/python3/dist-packages/urllib3/util/retry.py:515: MaxRetryError + +During handling of the above exception, another exception occurred: + +run_servefile = ._run_servefile at 0xb5be9988> +datadir = ._datadir at 0xb5ac8b68> + + def test_https_big_download(run_servefile, datadir): + # test with about 10 mb of data + data = "x" * (10 * 1024 ** 2) + p = datadir({'testfile': data}) / 'testfile' + run_servefile(['--ssl', str(p)]) + + urllib3.disable_warnings() +> _retry_while(ConnectionError, check_download)(data, protocol='https', verify=False) + +tests/test_servefile.py:437: +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ +tests/test_servefile.py:127: in wrapped + return function(*args, **kwargs) +tests/test_servefile.py:109: in check_download + r = make_request(path, **kwargs) +tests/test_servefile.py:98: in make_request + r = getattr(requests, method)(url, **kwargs) +/usr/lib/python3/dist-packages/requests/api.py:73: in get + return request("get", url, params=params, **kwargs) 
+/usr/lib/python3/dist-packages/requests/api.py:59: in request + return session.request(method=method, url=url, **kwargs) +/usr/lib/python3/dist-packages/requests/sessions.py:589: in request + resp = self.send(prep, **send_kwargs) +/usr/lib/python3/dist-packages/requests/sessions.py:703: in send + r = adapter.send(request, **kwargs) +_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ + +self = +request = , stream = False +timeout = Timeout(connect=None, read=None, total=None), verify = False +cert = None, proxies = OrderedDict() + + def send( + self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None + ): + """Sends PreparedRequest object. Returns Response object. + + :param request: The :class:`PreparedRequest ` being sent. + :param stream: (optional) Whether to stream the request content. + :param timeout: (optional) How long to wait for the server to send + data before giving up, as a float, or a :ref:`(connect timeout, + read timeout) ` tuple. + :type timeout: float or tuple or urllib3 Timeout object + :param verify: (optional) Either a boolean, in which case it controls whether + we verify the server's TLS certificate, or a string, in which case it + must be a path to a CA bundle to use + :param cert: (optional) Any user-provided SSL certificate to be trusted. + :param proxies: (optional) The proxies dictionary to apply to the request. + :rtype: requests.Response + """ + + try: + conn = self.get_connection_with_tls_context( + request, verify, proxies=proxies, cert=cert + ) + except LocationValueError as e: + raise InvalidURL(e, request=request) + + self.cert_verify(conn, request.url, verify, cert) + url = self.request_url(request, proxies) + self.add_headers( + request, + stream=stream, + timeout=timeout, + verify=verify, + cert=cert, + proxies=proxies, + ) + + chunked = not (request.body is None or "Content-Length" in request.headers) + + if isinstance(timeout, tuple): + try: + connect, read = timeout + timeout = TimeoutSauce(connect=connect, read=read) + except ValueError: + raise ValueError( + f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " + f"or a single float to set both timeouts to the same value." + ) + elif isinstance(timeout, TimeoutSauce): + pass + else: + timeout = TimeoutSauce(connect=timeout, read=timeout) + + try: + resp = conn.urlopen( + method=request.method, + url=url, + body=request.body, + headers=request.headers, + redirect=False, + assert_same_host=False, + preload_content=False, + decode_content=False, + retries=self.max_retries, + timeout=timeout, + chunked=chunked, + ) + + except (ProtocolError, OSError) as err: + raise ConnectionError(err, request=request) + + except MaxRetryError as e: + if isinstance(e.reason, ConnectTimeoutError): + # TODO: Remove this in 3.0.0: see #2811 + if not isinstance(e.reason, NewConnectionError): + raise ConnectTimeout(e, request=request) + + if isinstance(e.reason, ResponseError): + raise RetryError(e, request=request) + + if isinstance(e.reason, _ProxyError): + raise ProxyError(e, request=request) + + if isinstance(e.reason, _SSLError): + # This branch is for urllib3 v1.22 and later. 
+ raise SSLError(e, request=request) + +> raise ConnectionError(e, request=request) +E requests.exceptions.ConnectionError: HTTPSConnectionPool(host='localhost', port=58835): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) + +/usr/lib/python3/dist-packages/requests/adapters.py:700: ConnectionError +----------------------------- Captured stdout call ----------------------------- +running -m, servefile with args ['--ssl', '/tmp/pytest-of-pbuilder2/pytest-0/test_https_big_download0/testfile', '-p', '58835'] +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Generating SSL certificate...Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +Calling get on https://localhost:58835/ with {'verify': False} +=========================== short test summary info ============================ +FAILED tests/test_servefile.py::test_https - requests.exceptions.ConnectionEr... +FAILED tests/test_servefile.py::test_https_big_download - requests.exceptions... 
+======================== 2 failed, 17 passed in 27.16s ========================= +E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /build/reproducible-path/servefile-0.5.4/.pybuild/cpython3_3.12_servefile/build; python3.12 -m pytest tests +dh_auto_test: error: pybuild --test -i python{version} -p 3.12 --test-pytest returned exit code 13 +make[1]: *** [debian/rules:11: override_dh_auto_test] Error 25 +make[1]: Leaving directory '/build/reproducible-path/servefile-0.5.4' +make: *** [debian/rules:8: binary] Error 2 +dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2 I: copying local configuration +E: Failed autobuilding of package +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/C01_cleanup starting +debug output: disk usage on i-capture-the-hostname at Sat Nov 9 09:51:32 UTC 2024 +Filesystem Size Used Avail Use% Mounted on +tmpfs 2.0G 0 2.0G 0% /dev/shm + +I: user script /srv/workspace/pbuilder/3599/tmp/hooks/C01_cleanup finished I: unmounting dev/ptmx filesystem I: unmounting dev/pts filesystem I: unmounting dev/shm filesystem I: unmounting proc filesystem I: unmounting sys filesystem I: cleaning the build env -I: removing directory /srv/workspace/pbuilder/15679 and its subdirectories -I: Current time: Fri Nov 8 21:45:16 -12 2024 -I: pbuilder-time-stamp: 1731145516 +I: removing directory /srv/workspace/pbuilder/3599 and its subdirectories
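Note on the failure above: test_https and test_https_big_download fail the same way. The harness starts servefile with --ssl on port 58835 and then polls https://localhost:58835/, but every GET is refused with [Errno 111]; the captured stdout shows servefile only reaching "Generating SSL certificate..." partway through the twenty attempts, which suggests the HTTPS listener never came up within the retry window on this builder, so the _retry_while() wrapper in tests/test_servefile.py eventually re-raises the ConnectionError. The snippet below is a minimal sketch of that retry-until-listening pattern, assuming a fixed poll interval and an overall deadline; the names retry_while and fetch_root and the timeout values are illustrative, not the test suite's actual helpers.

# Sketch only, not part of servefile or its test suite: poll an HTTPS
# endpoint while the server is still starting up, re-raising the last
# ConnectionError once an overall deadline has passed.
import time
import requests


def retry_while(exc_type, func, timeout=15.0, interval=0.5):
    """Keep calling func() while it raises exc_type; give up and
    re-raise once the deadline has passed."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            return func()
        except exc_type:
            if time.monotonic() >= deadline:
                raise
            time.sleep(interval)


def fetch_root(port):
    # verify=False mirrors the failing tests: servefile serves a freshly
    # generated self-signed certificate, so verification is disabled.
    return requests.get(f"https://localhost:{port}/", verify=False)


# Example: poll until the listener accepts connections or the deadline hits.
# resp = retry_while(requests.exceptions.ConnectionError,
#                    lambda: fetch_root(58835), timeout=30.0)

A deadline-based loop like this tolerates slow startup (for example, key generation on a slow armhf builder) without masking a server that never starts, because the final ConnectionError is re-raised once the deadline expires.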